Apr 20 17:48:01.113046 ip-10-0-138-9 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 17:48:01.514155 ip-10-0-138-9 kubenswrapper[2581]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 17:48:01.514155 ip-10-0-138-9 kubenswrapper[2581]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 17:48:01.514155 ip-10-0-138-9 kubenswrapper[2581]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 17:48:01.514155 ip-10-0-138-9 kubenswrapper[2581]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 17:48:01.514155 ip-10-0-138-9 kubenswrapper[2581]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 17:48:01.515003 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.514912 2581 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
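The five deprecation warnings above all point at the same fix: move the flags into the KubeletConfiguration file named by --config (/etc/kubernetes/kubelet.conf in the FLAG dump further down). A minimal sketch, assuming the upstream kubelet.config.k8s.io/v1beta1 field names and taking the values from that FLAG dump; --minimum-container-ttl-duration has no direct config field (per the warning, use evictionHard/evictionSoft), and --pod-infra-container-image moves to the container runtime's sandbox-image setting rather than the kubelet config:

    # Hypothetical migration helper: print the KubeletConfiguration YAML that
    # replaces the deprecated flags warned about above. Field names assume the
    # kubelet.config.k8s.io/v1beta1 API; values are copied from the FLAG dump.
    deprecated = {
        "--container-runtime-endpoint": ("containerRuntimeEndpoint", "/var/run/crio/crio.sock"),
        "--volume-plugin-dir": ("volumePluginDir", "/etc/kubernetes/kubelet-plugins/volume/exec"),
        "--system-reserved": ("systemReserved", {"cpu": "500m", "ephemeral-storage": "1Gi", "memory": "1Gi"}),
    }
    print("apiVersion: kubelet.config.k8s.io/v1beta1")
    print("kind: KubeletConfiguration")
    for _flag, (field, value) in deprecated.items():
        if isinstance(value, dict):  # systemReserved is a map in the config API
            print(f"{field}:")
            for k, v in sorted(value.items()):
                print(f"  {k}: {v}")
        else:
            print(f"{field}: {value}")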
Apr 20 17:48:01.517278 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517262 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 17:48:01.517278 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517277 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 17:48:01.517348 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517281 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 17:48:01.517348 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517284 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 17:48:01.517348 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517287 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 17:48:01.517348 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517290 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 17:48:01.517348 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517293 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 17:48:01.517348 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517296 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 17:48:01.517348 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517299 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 17:48:01.517348 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517302 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 17:48:01.517348 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517305 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 17:48:01.517348 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517307 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 17:48:01.517348 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517311 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 17:48:01.517348 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517314 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 17:48:01.517348 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517317 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 17:48:01.517348 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517320 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 17:48:01.517348 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517322 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 17:48:01.517348 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517325 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 17:48:01.517348 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517327 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 17:48:01.517348 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517330 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 17:48:01.517348 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517332 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 17:48:01.517348 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517335 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 17:48:01.517854 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517338 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 17:48:01.517854 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517340 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 17:48:01.517854 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517343 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 17:48:01.517854 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517345 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 17:48:01.517854 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517350 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 17:48:01.517854 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517354 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 17:48:01.517854 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517357 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 17:48:01.517854 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517360 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 17:48:01.517854 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517363 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 17:48:01.517854 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517366 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 17:48:01.517854 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517369 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 17:48:01.517854 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517371 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 17:48:01.517854 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517374 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 17:48:01.517854 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517376 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 17:48:01.517854 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517379 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 17:48:01.517854 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517382 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 17:48:01.517854 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517384 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 17:48:01.517854 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517387 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 17:48:01.517854 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517390 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 17:48:01.517854 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517392 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 17:48:01.518340 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517395 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 17:48:01.518340 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517398 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 17:48:01.518340 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517401 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 17:48:01.518340 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517404 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 17:48:01.518340 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517406 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 17:48:01.518340 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517409 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 17:48:01.518340 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517411 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 17:48:01.518340 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517414 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 17:48:01.518340 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517416 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 17:48:01.518340 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517419 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 17:48:01.518340 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517422 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 17:48:01.518340 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517425 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 17:48:01.518340 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517427 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 17:48:01.518340 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517431 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 17:48:01.518340 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517434 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 17:48:01.518340 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517437 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 17:48:01.518340 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517439 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 17:48:01.518340 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517442 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 17:48:01.518340 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517445 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 17:48:01.518340 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517448 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 17:48:01.518882 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517450 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 17:48:01.518882 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517453 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 17:48:01.518882 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517455 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 17:48:01.518882 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517460 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 17:48:01.518882 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517464 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 17:48:01.518882 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517467 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 17:48:01.518882 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517469 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 17:48:01.518882 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517472 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 17:48:01.518882 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517475 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 17:48:01.518882 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517477 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 17:48:01.518882 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517480 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 17:48:01.518882 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517483 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 17:48:01.518882 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517485 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 17:48:01.518882 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517488 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 17:48:01.518882 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517491 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 17:48:01.518882 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517493 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 17:48:01.518882 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517496 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 17:48:01.518882 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517499 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 17:48:01.518882 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517501 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 17:48:01.519343 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517504 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 17:48:01.519343 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517506 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 17:48:01.519343 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517509 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 17:48:01.519343 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517520 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 17:48:01.519343 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517524 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 17:48:01.519343 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517935 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 17:48:01.519343 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517942 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 17:48:01.519343 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517946 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 17:48:01.519343 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517949 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 17:48:01.519343 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517953 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 17:48:01.519343 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517956 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 17:48:01.519343 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517959 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 17:48:01.519343 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517961 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 17:48:01.519343 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517964 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 17:48:01.519343 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517967 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 17:48:01.519343 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517970 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 17:48:01.519343 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517973 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 17:48:01.519343 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517975 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 17:48:01.519343 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517978 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 17:48:01.519343 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517981 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 17:48:01.519853 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517983 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 17:48:01.519853 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517986 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 17:48:01.519853 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517988 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 17:48:01.519853 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517992 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 17:48:01.519853 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517994 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 17:48:01.519853 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.517997 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 17:48:01.519853 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518000 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 17:48:01.519853 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518003 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 17:48:01.519853 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518006 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 17:48:01.519853 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518008 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 17:48:01.519853 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518011 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 17:48:01.519853 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518013 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 17:48:01.519853 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518016 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 17:48:01.519853 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518018 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 17:48:01.519853 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518021 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 17:48:01.519853 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518023 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 17:48:01.519853 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518027 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 17:48:01.519853 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518029 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 17:48:01.519853 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518032 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 17:48:01.519853 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518034 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 17:48:01.520345 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518037 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 17:48:01.520345 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518040 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 17:48:01.520345 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518042 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 17:48:01.520345 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518045 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 17:48:01.520345 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518048 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 17:48:01.520345 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518050 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 17:48:01.520345 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518053 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 17:48:01.520345 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518055 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 17:48:01.520345 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518058 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 17:48:01.520345 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518060 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 17:48:01.520345 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518063 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 17:48:01.520345 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518065 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 17:48:01.520345 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518068 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 17:48:01.520345 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518070 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 17:48:01.520345 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518073 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 17:48:01.520345 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518077 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 17:48:01.520345 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518079 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 17:48:01.520345 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518083 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 17:48:01.520345 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518085 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 17:48:01.520345 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518094 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 17:48:01.520944 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518097 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 17:48:01.520944 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518100 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 17:48:01.520944 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518102 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 17:48:01.520944 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518104 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 17:48:01.520944 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518107 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 17:48:01.520944 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518110 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 17:48:01.520944 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518113 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 17:48:01.520944 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518115 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 17:48:01.520944 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518118 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 17:48:01.520944 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518121 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 17:48:01.520944 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518124 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 17:48:01.520944 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518126 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 17:48:01.520944 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518129 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 17:48:01.520944 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518131 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 17:48:01.520944 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518135 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 17:48:01.520944 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518140 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 17:48:01.520944 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518143 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 17:48:01.520944 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518146 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 17:48:01.520944 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518149 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 17:48:01.520944 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518151 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 17:48:01.521443 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518154 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 17:48:01.521443 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518156 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 17:48:01.521443 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518159 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 17:48:01.521443 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518161 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 17:48:01.521443 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518164 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 17:48:01.521443 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518166 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 17:48:01.521443 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518169 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 17:48:01.521443 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518172 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 17:48:01.521443 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518174 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 17:48:01.521443 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518177 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 17:48:01.521443 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.518180 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
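Every gate named in the warnings above appears to be an OpenShift cluster-level feature gate handed to a kubelet that only recognizes upstream Kubernetes gates; the only entries it does recognize are the deprecated KMSv1 and the GA ServiceAccountTokenNodeBinding. The kubelet logs the whole list once per pass it makes over its feature-gate configuration, which is why the same names repeat again after the FLAG dump below. A sketch for tallying the noise from a saved journal excerpt (assuming the unit is kubelet.service, e.g. journalctl -u kubelet > kubelet.log):

    import re
    import sys
    from collections import Counter

    # Count how many times each unrecognized gate is reported; feed the log
    # on stdin: python3 tally_gates.py < kubelet.log
    counts = Counter(re.findall(r"unrecognized feature gate: (\S+)", sys.stdin.read()))
    for gate, n in counts.most_common():
        print(f"{n}x {gate}")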
Apr 20 17:48:01.521443 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519583 2581 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 17:48:01.521443 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519595 2581 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 17:48:01.521443 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519603 2581 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 17:48:01.521443 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519608 2581 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 17:48:01.521443 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519612 2581 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 17:48:01.521443 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519615 2581 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 17:48:01.521443 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519632 2581 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 17:48:01.521443 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519637 2581 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 17:48:01.521443 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519640 2581 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 17:48:01.521979 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519643 2581 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 17:48:01.521979 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519647 2581 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 17:48:01.521979 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519651 2581 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 17:48:01.521979 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519655 2581 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 17:48:01.521979 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519658 2581 flags.go:64] FLAG: --cgroup-root=""
Apr 20 17:48:01.521979 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519661 2581 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 17:48:01.521979 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519664 2581 flags.go:64] FLAG: --client-ca-file=""
Apr 20 17:48:01.521979 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519667 2581 flags.go:64] FLAG: --cloud-config=""
Apr 20 17:48:01.521979 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519670 2581 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 17:48:01.521979 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519673 2581 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 17:48:01.521979 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519678 2581 flags.go:64] FLAG: --cluster-domain=""
Apr 20 17:48:01.521979 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519681 2581 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 17:48:01.521979 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519684 2581 flags.go:64] FLAG: --config-dir=""
Apr 20 17:48:01.521979 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519687 2581 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 17:48:01.521979 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519691 2581 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 17:48:01.521979 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519695 2581 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 17:48:01.521979 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519698 2581 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 17:48:01.521979 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519701 2581 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 17:48:01.521979 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519704 2581 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 17:48:01.521979 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519708 2581 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 17:48:01.521979 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519711 2581 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 17:48:01.521979 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519714 2581 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 17:48:01.521979 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519718 2581 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 17:48:01.521979 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519721 2581 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 17:48:01.521979 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519725 2581 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 17:48:01.522574 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519729 2581 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 17:48:01.522574 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519732 2581 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 17:48:01.522574 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519735 2581 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 17:48:01.522574 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519738 2581 flags.go:64] FLAG: --enable-server="true"
Apr 20 17:48:01.522574 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519741 2581 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 17:48:01.522574 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519746 2581 flags.go:64] FLAG: --event-burst="100"
Apr 20 17:48:01.522574 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519749 2581 flags.go:64] FLAG: --event-qps="50"
Apr 20 17:48:01.522574 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519753 2581 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 17:48:01.522574 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519756 2581 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 17:48:01.522574 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519759 2581 flags.go:64] FLAG: --eviction-hard=""
Apr 20 17:48:01.522574 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519763 2581 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 17:48:01.522574 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519767 2581 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 17:48:01.522574 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519770 2581 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 17:48:01.522574 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519773 2581 flags.go:64] FLAG: --eviction-soft=""
Apr 20 17:48:01.522574 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519776 2581 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 17:48:01.522574 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519780 2581 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 17:48:01.522574 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519782 2581 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 17:48:01.522574 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519786 2581 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 17:48:01.522574 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519789 2581 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 17:48:01.522574 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519792 2581 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 17:48:01.522574 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519795 2581 flags.go:64] FLAG: --feature-gates=""
Apr 20 17:48:01.522574 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519799 2581 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 17:48:01.522574 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519802 2581 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 17:48:01.522574 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519806 2581 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 17:48:01.522574 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519809 2581 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 20 17:48:01.523197 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519812 2581 flags.go:64] FLAG: --healthz-port="10248"
Apr 20 17:48:01.523197 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519815 2581 flags.go:64] FLAG: --help="false"
Apr 20 17:48:01.523197 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519818 2581 flags.go:64] FLAG: --hostname-override="ip-10-0-138-9.ec2.internal"
Apr 20 17:48:01.523197 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519821 2581 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 20 17:48:01.523197 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519825 2581 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 20 17:48:01.523197 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519828 2581 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 20 17:48:01.523197 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519832 2581 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 20 17:48:01.523197 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519835 2581 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 20 17:48:01.523197 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519839 2581 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 20 17:48:01.523197 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519842 2581 flags.go:64] FLAG: --image-service-endpoint=""
Apr 20 17:48:01.523197 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519845 2581 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 20 17:48:01.523197 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519848 2581 flags.go:64] FLAG: --kube-api-burst="100"
Apr 20 17:48:01.523197 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519851 2581 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 20 17:48:01.523197 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519854 2581 flags.go:64] FLAG: --kube-api-qps="50"
Apr 20 17:48:01.523197 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519857 2581 flags.go:64] FLAG: --kube-reserved=""
Apr 20 17:48:01.523197 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519861 2581 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 20 17:48:01.523197 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519864 2581 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 20 17:48:01.523197 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519867 2581 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 20 17:48:01.523197 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519870 2581 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 20 17:48:01.523197 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519874 2581 flags.go:64] FLAG: --lock-file=""
Apr 20 17:48:01.523197 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519877 2581 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 20 17:48:01.523197 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519880 2581 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 20 17:48:01.523197 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519883 2581 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 20 17:48:01.523197 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519888 2581 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 20 17:48:01.523829 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519891 2581 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 20 17:48:01.523829 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519894 2581 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 20 17:48:01.523829 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519897 2581 flags.go:64] FLAG: --logging-format="text"
Apr 20 17:48:01.523829 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519900 2581 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 20 17:48:01.523829 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519903 2581 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 20 17:48:01.523829 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519906 2581 flags.go:64] FLAG: --manifest-url=""
Apr 20 17:48:01.523829 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519909 2581 flags.go:64] FLAG: --manifest-url-header=""
Apr 20 17:48:01.523829 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519917 2581 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 20 17:48:01.523829 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519920 2581 flags.go:64] FLAG: --max-open-files="1000000"
Apr 20 17:48:01.523829 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519924 2581 flags.go:64] FLAG: --max-pods="110"
Apr 20 17:48:01.523829 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519927 2581 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 20 17:48:01.523829 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519930 2581 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 20 17:48:01.523829 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519933 2581 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 20 17:48:01.523829 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519936 2581 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 20 17:48:01.523829 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519941 2581 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 20 17:48:01.523829 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519944 2581 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 20 17:48:01.523829 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519948 2581 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 20 17:48:01.523829 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519956 2581 flags.go:64] FLAG: --node-status-max-images="50"
Apr 20 17:48:01.523829 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519959 2581 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 20 17:48:01.523829 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519963 2581 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 20 17:48:01.523829 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519966 2581 flags.go:64] FLAG: --pod-cidr=""
Apr 20 17:48:01.523829 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519969 2581 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 20 17:48:01.523829 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519974 2581 flags.go:64] FLAG: --pod-manifest-path=""
Apr 20 17:48:01.524377 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519981 2581 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 20 17:48:01.524377 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519984 2581 flags.go:64] FLAG: --pods-per-core="0"
Apr 20 17:48:01.524377 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519987 2581 flags.go:64] FLAG: --port="10250"
Apr 20 17:48:01.524377 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519990 2581 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 20 17:48:01.524377 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519993 2581 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0592c22eae33f07b7"
Apr 20 17:48:01.524377 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.519997 2581 flags.go:64] FLAG: --qos-reserved=""
Apr 20 17:48:01.524377 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520000 2581 flags.go:64] FLAG: --read-only-port="10255"
Apr 20 17:48:01.524377 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520003 2581 flags.go:64] FLAG: --register-node="true"
Apr 20 17:48:01.524377 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520006 2581 flags.go:64] FLAG: --register-schedulable="true"
Apr 20 17:48:01.524377 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520009 2581 flags.go:64] FLAG: --register-with-taints=""
Apr 20 17:48:01.524377 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520013 2581 flags.go:64] FLAG: --registry-burst="10"
Apr 20 17:48:01.524377 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520016 2581 flags.go:64] FLAG: --registry-qps="5"
Apr 20 17:48:01.524377 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520019 2581 flags.go:64] FLAG: --reserved-cpus=""
Apr 20 17:48:01.524377 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520022 2581 flags.go:64] FLAG: --reserved-memory=""
Apr 20 17:48:01.524377 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520026 2581 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 20 17:48:01.524377 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520029 2581 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 20 17:48:01.524377 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520032 2581 flags.go:64] FLAG: --rotate-certificates="false"
Apr 20 17:48:01.524377 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520035 2581 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 20 17:48:01.524377 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520037 2581 flags.go:64] FLAG: --runonce="false"
Apr 20 17:48:01.524377 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520040 2581 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 20 17:48:01.524377 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520043 2581 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 20 17:48:01.524377 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520047 2581 flags.go:64] FLAG: --seccomp-default="false"
Apr 20 17:48:01.524377 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520049 2581 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 20 17:48:01.524377 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520053 2581 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 20 17:48:01.524377 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520057 2581 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 20 17:48:01.524377 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520060 2581 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 20 17:48:01.525020 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520063 2581 flags.go:64] FLAG: --storage-driver-password="root"
Apr 20 17:48:01.525020 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520066 2581 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 20 17:48:01.525020 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520069 2581 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 20 17:48:01.525020 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520072 2581 flags.go:64] FLAG: --storage-driver-user="root"
Apr 20 17:48:01.525020 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520075 2581 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 20 17:48:01.525020 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520078 2581 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 20 17:48:01.525020 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520082 2581 flags.go:64] FLAG: --system-cgroups=""
Apr 20 17:48:01.525020 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520085 2581 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 20 17:48:01.525020 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520091 2581 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 20 17:48:01.525020 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520094 2581 flags.go:64] FLAG: --tls-cert-file=""
Apr 20 17:48:01.525020 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520097 2581 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 20 17:48:01.525020 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520101 2581 flags.go:64] FLAG: --tls-min-version=""
Apr 20 17:48:01.525020 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520104 2581 flags.go:64] FLAG: --tls-private-key-file=""
Apr 20 17:48:01.525020 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520107 2581 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 20 17:48:01.525020 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520110 2581 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 20 17:48:01.525020 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520112 2581 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 20 17:48:01.525020 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520115 2581 flags.go:64] FLAG: --v="2"
Apr 20 17:48:01.525020 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520120 2581 flags.go:64] FLAG: --version="false"
Apr 20 17:48:01.525020 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520124 2581 flags.go:64] FLAG: --vmodule=""
Apr 20 17:48:01.525020 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520129 2581 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 20 17:48:01.525020 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.520132 2581 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
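At -v=2 the kubelet echoes every effective command-line flag through flags.go:64, so the dump above doubles as an audit trail: note --config=/etc/kubernetes/kubelet.conf, --cloud-provider=external, and the deprecated --system-reserved and --minimum-container-ttl-duration values that triggered the startup warnings. A sketch that recovers the flags into a dict, assuming the FLAG: --name="value" shape shown above:

    import re
    import sys

    # Parse `flags.go:64] FLAG: --name="value"` records into a dict; the values
    # shown here never contain embedded double quotes, so a non-greedy match works.
    flags = dict(re.findall(r'FLAG: (--[\w-]+)="(.*?)"', sys.stdin.read()))
    print(flags["--config"])                      # /etc/kubernetes/kubelet.conf
    print(flags["--container-runtime-endpoint"])  # /var/run/crio/crio.sock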
Apr 20 17:48:01.525020 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521688 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 17:48:01.525020 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521695 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 17:48:01.525020 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521698 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 17:48:01.525602 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521702 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 17:48:01.525602 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521706 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 17:48:01.525602 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521709 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 17:48:01.525602 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521712 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 17:48:01.525602 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521715 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 17:48:01.525602 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521719 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 17:48:01.525602 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521723 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 17:48:01.525602 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521726 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 17:48:01.525602 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521730 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 17:48:01.525602 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521732 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 17:48:01.525602 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521735 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 17:48:01.525602 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521738 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 17:48:01.525602 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521741 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 17:48:01.525602 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521744 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 17:48:01.525602 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521747 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 17:48:01.525602 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521750 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 17:48:01.525602 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521753 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 17:48:01.525602 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521755 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 17:48:01.525602 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521758 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 17:48:01.525602 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521761 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 17:48:01.526119 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521763 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 17:48:01.526119 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521768 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 17:48:01.526119 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521771 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 17:48:01.526119 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521774 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 17:48:01.526119 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521778 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 17:48:01.526119 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521780 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 17:48:01.526119 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521783 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 17:48:01.526119 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521786 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 17:48:01.526119 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521789 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 17:48:01.526119 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521791 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 17:48:01.526119 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521794 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 17:48:01.526119 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521796 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 17:48:01.526119 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521799 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 17:48:01.526119 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521802 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 17:48:01.526119 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521804 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 17:48:01.526119 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521807 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 17:48:01.526119 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521809 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 17:48:01.526119 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521812 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 17:48:01.526119 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521815 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 17:48:01.526586 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521818 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 17:48:01.526586 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521820 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 17:48:01.526586 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521823 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 17:48:01.526586 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521825 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 17:48:01.526586 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521829 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 17:48:01.526586 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521832 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 17:48:01.526586 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521834 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 17:48:01.526586 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521837 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 17:48:01.526586 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521840 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 17:48:01.526586 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521843 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 17:48:01.526586 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521845 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 17:48:01.526586 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521848 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 17:48:01.526586 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521850 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 17:48:01.526586 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521853 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 17:48:01.526586 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521855 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 17:48:01.526586 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521858 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 17:48:01.526586 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521860 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 17:48:01.526586 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521863 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 17:48:01.526586 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521866 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 17:48:01.526586 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521869 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 17:48:01.527117 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521871 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 17:48:01.527117 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521874 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 17:48:01.527117 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521876 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 17:48:01.527117 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521879 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 17:48:01.527117 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521882 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 17:48:01.527117 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521885 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 17:48:01.527117 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521887 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 17:48:01.527117 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521890 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 17:48:01.527117 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521893 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 17:48:01.527117 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521895 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 17:48:01.527117 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521898 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 17:48:01.527117 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521901 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 17:48:01.527117 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521904 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 17:48:01.527117 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521907 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 17:48:01.527117 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521909 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 17:48:01.527117 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521912 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 17:48:01.527117 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521915 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 17:48:01.527117 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521918 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 17:48:01.527117 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521920 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 17:48:01.527117 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521923 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 17:48:01.527657 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521926 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 17:48:01.527657 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521929 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 17:48:01.527657 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521931 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 17:48:01.527657 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.521934 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 17:48:01.527657 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.522671 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 17:48:01.529838 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.529810 2581 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 17:48:01.529838 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.529834 2581 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 17:48:01.530019 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.529908 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 17:48:01.530019 ip-10-0-138-9 kubenswrapper[2581]: W0420
17:48:01.529916 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 17:48:01.530019 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.529921 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 17:48:01.530019 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.529928 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 17:48:01.530019 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.529934 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 17:48:01.530019 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.529939 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 17:48:01.530019 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.529943 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 17:48:01.530019 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.529948 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 17:48:01.530019 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.529952 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 17:48:01.530019 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.529957 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 17:48:01.530019 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.529961 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 17:48:01.530019 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.529964 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 17:48:01.530019 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.529968 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 17:48:01.530019 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.529973 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 17:48:01.530019 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.529977 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 17:48:01.530019 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.529981 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 17:48:01.530019 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.529986 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 17:48:01.530019 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.529990 2581 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 17:48:01.530019 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.529995 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 17:48:01.530841 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.529999 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 17:48:01.530841 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530003 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 17:48:01.530841 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530007 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 17:48:01.530841 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530014 2581 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 17:48:01.530841 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530018 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig 
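The same enumeration repeats because the merged gate set is applied more than once during startup (note the identical feature_gate.go:384 map logged again after the "Kubelet version" line above); each pass re-reports every unknown gate. The map itself is just the Name=bool form accepted by the kubelet's --feature-gates flag and config file. Below is a rough decoder for that textual form, assuming the standard comma-separated syntax; the helper is hypothetical, not kubelet code.

    package main

    import (
        "fmt"
        "strconv"
        "strings"
    )

    // parseGates decodes a "Name=bool,Name=bool" feature-gate string.
    func parseGates(s string) (map[string]bool, error) {
        out := map[string]bool{}
        for _, pair := range strings.Split(s, ",") {
            name, val, ok := strings.Cut(pair, "=")
            if !ok {
                return nil, fmt.Errorf("malformed gate %q", pair)
            }
            b, err := strconv.ParseBool(val)
            if err != nil {
                return nil, fmt.Errorf("gate %s: %w", name, err)
            }
            out[strings.TrimSpace(name)] = b
        }
        return out, nil
    }

    func main() {
        gates, err := parseGates("KMSv1=true,ProcMountType=true,NodeSwap=false")
        if err != nil {
            panic(err)
        }
        fmt.Println(gates) // map[KMSv1:true NodeSwap:false ProcMountType:true]
    }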
Apr 20 17:48:01.530841 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530022 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 17:48:01.530841 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530029 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 17:48:01.530841 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530035 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 17:48:01.530841 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530041 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 17:48:01.530841 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530046 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 17:48:01.530841 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530050 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 17:48:01.530841 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530055 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 17:48:01.530841 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530060 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 17:48:01.530841 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530064 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 17:48:01.530841 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530068 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 17:48:01.530841 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530073 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 17:48:01.530841 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530077 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 17:48:01.530841 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530080 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 17:48:01.530841 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530085 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 17:48:01.531456 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530089 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 17:48:01.531456 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530094 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 17:48:01.531456 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530098 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 17:48:01.531456 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530102 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 17:48:01.531456 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530106 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 17:48:01.531456 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530110 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 17:48:01.531456 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530114 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 17:48:01.531456 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530118 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 17:48:01.531456 ip-10-0-138-9 
kubenswrapper[2581]: W0420 17:48:01.530122 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 17:48:01.531456 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530126 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 17:48:01.531456 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530130 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 17:48:01.531456 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530134 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 17:48:01.531456 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530139 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 17:48:01.531456 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530143 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 17:48:01.531456 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530148 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 17:48:01.531456 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530153 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 17:48:01.531456 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530157 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 17:48:01.531456 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530163 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 17:48:01.531456 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530167 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 17:48:01.531456 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530172 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 17:48:01.532065 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530176 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 17:48:01.532065 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530180 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 17:48:01.532065 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530185 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 17:48:01.532065 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530189 2581 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 17:48:01.532065 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530194 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 17:48:01.532065 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530198 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 17:48:01.532065 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530202 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 17:48:01.532065 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530206 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 17:48:01.532065 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530210 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 17:48:01.532065 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530215 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 17:48:01.532065 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530220 2581 feature_gate.go:328] unrecognized feature gate: 
ShortCertRotation Apr 20 17:48:01.532065 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530224 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 17:48:01.532065 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530228 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 17:48:01.532065 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530232 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 17:48:01.532065 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530236 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 17:48:01.532065 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530240 2581 feature_gate.go:328] unrecognized feature gate: Example Apr 20 17:48:01.532065 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530244 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 17:48:01.532065 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530248 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 17:48:01.532065 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530252 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 17:48:01.532065 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530257 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 17:48:01.532783 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530261 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 17:48:01.532783 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530265 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 17:48:01.532783 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530270 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 17:48:01.532783 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530274 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 17:48:01.532783 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530278 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 17:48:01.532783 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530282 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 17:48:01.532783 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530287 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 17:48:01.532783 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530292 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 17:48:01.532783 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.530301 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 17:48:01.532783 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530551 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 17:48:01.532783 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530571 2581 feature_gate.go:328] unrecognized feature gate: 
InsightsConfigAPI Apr 20 17:48:01.532783 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530578 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 17:48:01.532783 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530583 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 17:48:01.532783 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530587 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 17:48:01.532783 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530591 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 17:48:01.532783 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530596 2581 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 17:48:01.533328 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530601 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 17:48:01.533328 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530605 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 17:48:01.533328 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530609 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 17:48:01.533328 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530613 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 17:48:01.533328 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530618 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 17:48:01.533328 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530639 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 17:48:01.533328 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530644 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 17:48:01.533328 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530648 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 17:48:01.533328 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530652 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 17:48:01.533328 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530656 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 17:48:01.533328 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530660 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 17:48:01.533328 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530664 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 17:48:01.533328 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530669 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 17:48:01.533328 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530672 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 17:48:01.533328 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530677 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 17:48:01.533328 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530680 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 17:48:01.533328 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530684 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 17:48:01.533328 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530689 2581 feature_gate.go:328] unrecognized feature gate: 
AutomatedEtcdBackup Apr 20 17:48:01.533328 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530693 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 17:48:01.533328 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530697 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 17:48:01.533906 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530701 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 17:48:01.533906 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530707 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 17:48:01.533906 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530715 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 17:48:01.533906 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530720 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 17:48:01.533906 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530725 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 17:48:01.533906 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530729 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 17:48:01.533906 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530734 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 17:48:01.533906 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530746 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 17:48:01.533906 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530751 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 17:48:01.533906 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530755 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 17:48:01.533906 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530759 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 17:48:01.533906 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530763 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 17:48:01.533906 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530767 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 17:48:01.533906 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530772 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 17:48:01.533906 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530776 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 17:48:01.533906 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530781 2581 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 17:48:01.533906 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530785 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 17:48:01.533906 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530790 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 17:48:01.533906 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530795 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 17:48:01.533906 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530799 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 17:48:01.534410 ip-10-0-138-9 kubenswrapper[2581]: 
W0420 17:48:01.530803 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 17:48:01.534410 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530807 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 17:48:01.534410 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530811 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 17:48:01.534410 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530816 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 17:48:01.534410 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530820 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 17:48:01.534410 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530824 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 17:48:01.534410 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530829 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 17:48:01.534410 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530833 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 17:48:01.534410 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530837 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 17:48:01.534410 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530841 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 17:48:01.534410 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530845 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 17:48:01.534410 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530848 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 17:48:01.534410 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530852 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 17:48:01.534410 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530856 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 17:48:01.534410 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530861 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 17:48:01.534410 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530865 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 17:48:01.534410 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530869 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 17:48:01.534410 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530876 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
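Two warnings in these runs concern gates the kubelet does recognize: feature_gate.go:349 flags KMSv1 because that gate is deprecated yet still explicitly set, and feature_gate.go:351 flags ServiceAccountTokenNodeBinding because that gate has gone GA, so forcing it on is redundant and the override itself will eventually be removed. Both are advisory, not errors. A sketch of such a lifecycle check, with the stage table assumed purely for illustration:

    package main

    import "log"

    type stage int

    const (
        alpha stage = iota
        beta
        ga
        deprecated
    )

    // Assumed stages for the two gates warned about in this log.
    var lifecycle = map[string]stage{
        "KMSv1":                          deprecated,
        "ServiceAccountTokenNodeBinding": ga,
    }

    func warnOnSet(name string, value bool) {
        switch lifecycle[name] {
        case deprecated:
            log.Printf("W: Setting deprecated feature gate %s=%t. It will be removed in a future release.", name, value)
        case ga:
            log.Printf("W: Setting GA feature gate %s=%t. It will be removed in a future release.", name, value)
        }
    }

    func main() {
        warnOnSet("KMSv1", true)
        warnOnSet("ServiceAccountTokenNodeBinding", true)
    }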
Apr 20 17:48:01.534410 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530882 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 17:48:01.534907 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530888 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 17:48:01.534907 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530900 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 17:48:01.534907 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530906 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 17:48:01.534907 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530910 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 17:48:01.534907 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530915 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 17:48:01.534907 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530919 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 17:48:01.534907 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530924 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 17:48:01.534907 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530928 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 17:48:01.534907 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530932 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 17:48:01.534907 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530937 2581 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 17:48:01.534907 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530941 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 17:48:01.534907 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530945 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 17:48:01.534907 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530949 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 17:48:01.534907 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530953 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 17:48:01.534907 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530958 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 17:48:01.534907 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530962 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 17:48:01.534907 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530966 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 17:48:01.534907 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530970 2581 feature_gate.go:328] unrecognized feature gate: Example Apr 20 17:48:01.534907 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530974 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 17:48:01.534907 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:01.530978 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 17:48:01.535403 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.530986 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false 
MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 17:48:01.535403 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.531829 2581 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 17:48:01.537424 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.537403 2581 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 17:48:01.538339 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.538326 2581 server.go:1019] "Starting client certificate rotation"
Apr 20 17:48:01.538475 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.538455 2581 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 17:48:01.538516 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.538498 2581 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 17:48:01.561083 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.561051 2581 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 17:48:01.565834 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.565792 2581 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 17:48:01.579882 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.579858 2581 log.go:25] "Validated CRI v1 runtime API"
Apr 20 17:48:01.586007 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.585983 2581 log.go:25] "Validated CRI v1 image API"
Apr 20 17:48:01.588104 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.588089 2581 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 17:48:01.591865 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.591158 2581 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 87c2baa9-d1e4-4a75-a9e6-c53ec11ed4ba:/dev/nvme0n1p4 996f760f-91c7-4d00-8625-f8ab31f0545f:/dev/nvme0n1p3]
Apr 20 17:48:01.591989 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.591862 2581 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 17:48:01.593812 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.593794 2581 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 17:48:01.597683 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.597553 2581 manager.go:217] Machine: {Timestamp:2026-04-20 17:48:01.595664044 +0000 UTC m=+0.371194829 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100247 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a49aa0ad8e283020451f181326371 SystemUUID:ec2a49aa-0ad8-e283-0204-51f181326371 BootID:080c8b0a-bb95-42ee-b733-ee2c8c895db5 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:3f:22:7a:ed:b1 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:3f:22:7a:ed:b1 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:1a:3b:93:21:83:33 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 17:48:01.597683 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.597679 2581 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
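The manager.go:217 Machine line is cAdvisor's one-shot hardware census: 8 logical CPUs on 4 physical cores, about 31 GiB of RAM, the NVMe partitions already seen in fs.go:136, the br-ex/ens5 NICs (sharing one MAC, as expected when OVN's br-ex takes over the uplink's address), and a single NUMA node. The kubelet later derives node capacity and topology hints from this; the manager_no_libpfm.go:29 line just notes that perf-event counters were compiled out. A stdlib-only sketch of the easiest parts of such a census is below; the real inventory also walks sysfs for caches, NUMA distances, NICs, and disks.

    package main

    import (
        "fmt"
        "os"
        "runtime"
    )

    func main() {
        // Logical CPUs (hardware threads), 8 on this node.
        fmt.Printf("NumCores: %d\n", runtime.NumCPU())
        // Same identifiers cAdvisor reports as MachineID and BootID.
        if id, err := os.ReadFile("/etc/machine-id"); err == nil {
            fmt.Printf("MachineID: %s", id)
        }
        if b, err := os.ReadFile("/proc/sys/kernel/random/boot_id"); err == nil {
            fmt.Printf("BootID: %s", b)
        }
    }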
Apr 20 17:48:01.597818 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.597806 2581 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 17:48:01.600158 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.600130 2581 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 17:48:01.600311 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.600159 2581 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-9.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 17:48:01.600352 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.600321 2581 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 17:48:01.600352 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.600329 2581 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 17:48:01.600352 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.600343 2581 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 17:48:01.601041 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.601031 2581 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 17:48:01.602601 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.602590 2581 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 17:48:01.602782 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.602773 2581 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 17:48:01.604928 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.604917 2581 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 17:48:01.604974 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.604939 2581 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 17:48:01.604974 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.604951 2581 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 17:48:01.604974 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.604964 2581 kubelet.go:397] "Adding apiserver pod source"
Apr 20 17:48:01.604974 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.604973 2581 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 17:48:01.606035 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.606019 2581 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 17:48:01.606080 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.606048 2581 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 17:48:01.609436 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.609392 2581 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 20 17:48:01.611283 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.611265 2581 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 20 17:48:01.613193 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.613176 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 20 17:48:01.613268 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.613202 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 20 17:48:01.613268 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.613215 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 20 17:48:01.613268 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.613225 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 20 17:48:01.613268 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.613233 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 20 17:48:01.613268 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.613241 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 20 17:48:01.613268 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.613249 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 20 17:48:01.613268 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.613257 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 20 17:48:01.613268 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.613266 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 20 17:48:01.613531 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.613280 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 20 17:48:01.613531 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.613294 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 20 17:48:01.613531 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.613305 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 20 17:48:01.614087 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.614076 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 20 17:48:01.614087 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.614088 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 20 17:48:01.617928 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.617912 2581 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 20 17:48:01.618022 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.617951 2581 server.go:1295] "Started kubelet"
Apr 20 17:48:01.618084 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.618046 2581 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 20 17:48:01.618160 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.618110 2581 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 20 17:48:01.618214 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.618184 2581 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 20 17:48:01.618644 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.618614 2581 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-9.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 17:48:01.618713 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:01.618670 2581 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 20 17:48:01.618755 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:01.618738 2581 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-9.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 20 17:48:01.618981 ip-10-0-138-9 systemd[1]: Started Kubernetes Kubelet.
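Everything the kubelet tries to read or write through the API in this window fails as system:anonymous: it has started serving before its client-certificate bootstrap has finished, so the CSINode lookup, the Service and Node list-watches, the Starting event, and the node lease are all rejected by RBAC and simply retried. The errors stop once the certificate signing request (csr-xltk9, just below) is approved and issued. A toy retry loop in the same spirit, with the forbidden error and the flip to "issued" simulated:

    package main

    import (
        "errors"
        "fmt"
        "time"
    )

    var errForbidden = errors.New("forbidden: User \"system:anonymous\"")

    // untilAuthorized retries a call while it fails with the pre-bootstrap
    // forbidden error, backing off between attempts (the lease controller
    // below reports a similar interval="200ms").
    func untilAuthorized(attempts int, call func() error) error {
        backoff := 200 * time.Millisecond
        for i := 0; i < attempts; i++ {
            err := call()
            if err == nil {
                return nil
            }
            if !errors.Is(err, errForbidden) {
                return err
            }
            fmt.Printf("E: %v, will retry in %s\n", err, backoff)
            time.Sleep(backoff)
            backoff *= 2
        }
        return fmt.Errorf("still unauthorized after %d attempts", attempts)
    }

    func main() {
        issued := false
        _ = untilAuthorized(5, func() error {
            if !issued {
                issued = true // pretend the CSR gets approved after one retry
                return errForbidden
            }
            return nil
        })
        fmt.Println("client certificate in place; watches can start")
    }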
Apr 20 17:48:01.620741 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.620725 2581 server.go:317] "Adding debug handlers to kubelet server"
Apr 20 17:48:01.622136 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.622118 2581 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 20 17:48:01.627264 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:01.626465 2581 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-9.ec2.internal.18a821dbf6afb069 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-9.ec2.internal,UID:ip-10-0-138-9.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-9.ec2.internal,},FirstTimestamp:2026-04-20 17:48:01.617924201 +0000 UTC m=+0.393454987,LastTimestamp:2026-04-20 17:48:01.617924201 +0000 UTC m=+0.393454987,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-9.ec2.internal,}"
Apr 20 17:48:01.628344 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.628320 2581 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 20 17:48:01.628344 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.628342 2581 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 20 17:48:01.629019 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.628969 2581 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 20 17:48:01.629019 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.628971 2581 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 20 17:48:01.629019 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.629008 2581 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 20 17:48:01.629233 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:01.629083 2581 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 17:48:01.629233 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.629137 2581 reconstruct.go:97] "Volume reconstruction finished"
Apr 20 17:48:01.629233 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.629147 2581 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 17:48:01.629233 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:01.629232 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-9.ec2.internal\" not found"
Apr 20 17:48:01.629394 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.629337 2581 factory.go:153] Registering CRI-O factory
Apr 20 17:48:01.629394 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.629352 2581 factory.go:223] Registration of the crio container factory successfully
Apr 20 17:48:01.629485 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.629404 2581 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 17:48:01.629485 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.629415 2581 factory.go:55] Registering systemd factory
Apr 20 17:48:01.629485 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.629424 2581 factory.go:223] Registration of the systemd container factory successfully
Apr 20 17:48:01.629485 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.629461 2581 factory.go:103] Registering Raw factory
Apr 20 17:48:01.629485 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.629476 2581 manager.go:1196] Started watching for new ooms in manager
Apr 20 17:48:01.629947 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.629934 2581 manager.go:319] Starting recovery of all containers
Apr 20 17:48:01.630316 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:01.630290 2581 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-138-9.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 20 17:48:01.630427 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:01.630407 2581 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 20 17:48:01.636048 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.635886 2581 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xltk9"
Apr 20 17:48:01.640365 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.640344 2581 manager.go:324] Recovery completed
Apr 20 17:48:01.642912 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.642893 2581 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xltk9"
Apr 20 17:48:01.646562 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.646541 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 17:48:01.649338 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.649322 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-9.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 17:48:01.649395 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.649353 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-9.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 17:48:01.649395 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.649368 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-9.ec2.internal" event="NodeHasSufficientPID"
Apr 20 17:48:01.649934 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.649920 2581 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 17:48:01.649934 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.649931 2581 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 17:48:01.650068 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.649974 2581 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 17:48:01.653489 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.653474 2581 policy_none.go:49] "None policy: Start"
Apr 20 17:48:01.653573 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.653493 2581 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 17:48:01.653573 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.653507 2581 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 17:48:01.696327 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.696306 2581 manager.go:341] "Starting Device Plugin manager"
Apr 20 17:48:01.708280 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:01.696352 2581 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 17:48:01.708280 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.696365 2581 server.go:85] "Starting device plugin registration server"
Apr 20 17:48:01.708280 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.696763 2581 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 17:48:01.708280 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.696783 2581 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 17:48:01.708280 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.696906 2581 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 17:48:01.708280 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.697000 2581 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 17:48:01.708280 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.697009 2581 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 17:48:01.708280 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:01.697576 2581 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 17:48:01.708280 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:01.697616 2581 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-9.ec2.internal\" not found"
Apr 20 17:48:01.759848 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.759813 2581 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 17:48:01.760986 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.760970 2581 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
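The resource managers that just started all run their no-op defaults: CPU manager policy "none" (no exclusive core pinning), memory manager policy "None", and, per the nodeConfig earlier, topology manager policy "none". The "Failed to read data from checkpoint" error appears benign on a freshly provisioned node, since no kubelet_internal_checkpoint has been written yet. A compressed sketch of this kind of policy selection by name, purely illustrative (the kubelet's managers are far more involved):

    package main

    import "fmt"

    type cpuPolicy interface{ Name() string }

    type nonePolicy struct{}

    func (nonePolicy) Name() string { return "none" }

    type staticPolicy struct{}

    func (staticPolicy) Name() string { return "static" }

    func newCPUPolicy(name string) (cpuPolicy, error) {
        switch name {
        case "none":
            return nonePolicy{}, nil // default: no exclusive core assignment
        case "static":
            return staticPolicy{}, nil // pins Guaranteed pods to whole cores
        default:
            return nil, fmt.Errorf("unknown CPU manager policy %q", name)
        }
    }

    func main() {
        p, _ := newCPUPolicy("none")
        fmt.Println("Starting CPU manager, policy:", p.Name())
    }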
protocol="IPv6" Apr 20 17:48:01.761050 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.760997 2581 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 17:48:01.761050 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.761015 2581 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 20 17:48:01.761050 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.761021 2581 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 17:48:01.761172 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:01.761053 2581 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 17:48:01.767191 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.767124 2581 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 17:48:01.797949 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.797924 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 17:48:01.799109 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.799077 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-9.ec2.internal" event="NodeHasSufficientMemory" Apr 20 17:48:01.799109 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.799113 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-9.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 17:48:01.799241 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.799126 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-9.ec2.internal" event="NodeHasSufficientPID" Apr 20 17:48:01.799241 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.799152 2581 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-9.ec2.internal" Apr 20 17:48:01.808230 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.808207 2581 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-9.ec2.internal" Apr 20 17:48:01.808230 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:01.808234 2581 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-9.ec2.internal\": node \"ip-10-0-138-9.ec2.internal\" not found" Apr 20 17:48:01.824231 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:01.824207 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-9.ec2.internal\" not found" Apr 20 17:48:01.861702 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.861655 2581 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-9.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-9.ec2.internal"] Apr 20 17:48:01.861845 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.861746 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 17:48:01.862674 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.862656 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-9.ec2.internal" event="NodeHasSufficientMemory" Apr 20 17:48:01.862777 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.862688 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-9.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 17:48:01.862777 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.862705 2581 kubelet_node_status.go:736] 
"Recording event message for node" node="ip-10-0-138-9.ec2.internal" event="NodeHasSufficientPID" Apr 20 17:48:01.865262 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.865248 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 17:48:01.865425 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.865411 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-9.ec2.internal" Apr 20 17:48:01.865467 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.865440 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 17:48:01.866000 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.865983 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-9.ec2.internal" event="NodeHasSufficientMemory" Apr 20 17:48:01.866096 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.866007 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-9.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 17:48:01.866096 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.866021 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-9.ec2.internal" event="NodeHasSufficientPID" Apr 20 17:48:01.866096 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.866078 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-9.ec2.internal" event="NodeHasSufficientMemory" Apr 20 17:48:01.866205 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.866109 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-9.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 17:48:01.866205 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.866125 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-9.ec2.internal" event="NodeHasSufficientPID" Apr 20 17:48:01.868399 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.868384 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-9.ec2.internal" Apr 20 17:48:01.868454 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.868410 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 17:48:01.869177 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.869161 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-9.ec2.internal" event="NodeHasSufficientMemory" Apr 20 17:48:01.869237 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.869189 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-9.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 17:48:01.869237 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:01.869211 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-9.ec2.internal" event="NodeHasSufficientPID" Apr 20 17:48:01.897636 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:01.897602 2581 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-9.ec2.internal\" not found" node="ip-10-0-138-9.ec2.internal" Apr 20 17:48:01.900944 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:01.900927 2581 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-9.ec2.internal\" not found" node="ip-10-0-138-9.ec2.internal" Apr 20 17:48:01.924442 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:01.924415 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-9.ec2.internal\" not found" Apr 20 17:48:02.024795 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:02.024704 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-9.ec2.internal\" not found" Apr 20 17:48:02.030057 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:02.030036 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/67dd5e3ce0ce81437a1036d614e3ee5e-config\") pod \"kube-apiserver-proxy-ip-10-0-138-9.ec2.internal\" (UID: \"67dd5e3ce0ce81437a1036d614e3ee5e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-9.ec2.internal" Apr 20 17:48:02.030120 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:02.030066 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d5a0805b170e56dce7c8063947d7841f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-9.ec2.internal\" (UID: \"d5a0805b170e56dce7c8063947d7841f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-9.ec2.internal" Apr 20 17:48:02.030120 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:02.030092 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d5a0805b170e56dce7c8063947d7841f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-9.ec2.internal\" (UID: \"d5a0805b170e56dce7c8063947d7841f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-9.ec2.internal" Apr 20 17:48:02.125341 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:02.125298 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-9.ec2.internal\" not found" Apr 20 17:48:02.130583 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:02.130563 2581 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d5a0805b170e56dce7c8063947d7841f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-9.ec2.internal\" (UID: \"d5a0805b170e56dce7c8063947d7841f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-9.ec2.internal" Apr 20 17:48:02.130652 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:02.130595 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d5a0805b170e56dce7c8063947d7841f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-9.ec2.internal\" (UID: \"d5a0805b170e56dce7c8063947d7841f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-9.ec2.internal" Apr 20 17:48:02.130652 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:02.130613 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/67dd5e3ce0ce81437a1036d614e3ee5e-config\") pod \"kube-apiserver-proxy-ip-10-0-138-9.ec2.internal\" (UID: \"67dd5e3ce0ce81437a1036d614e3ee5e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-9.ec2.internal" Apr 20 17:48:02.130717 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:02.130670 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d5a0805b170e56dce7c8063947d7841f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-9.ec2.internal\" (UID: \"d5a0805b170e56dce7c8063947d7841f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-9.ec2.internal" Apr 20 17:48:02.130717 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:02.130671 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d5a0805b170e56dce7c8063947d7841f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-9.ec2.internal\" (UID: \"d5a0805b170e56dce7c8063947d7841f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-9.ec2.internal" Apr 20 17:48:02.130717 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:02.130695 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/67dd5e3ce0ce81437a1036d614e3ee5e-config\") pod \"kube-apiserver-proxy-ip-10-0-138-9.ec2.internal\" (UID: \"67dd5e3ce0ce81437a1036d614e3ee5e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-9.ec2.internal" Apr 20 17:48:02.199720 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:02.199691 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-9.ec2.internal" Apr 20 17:48:02.204597 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:02.204574 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-9.ec2.internal" Apr 20 17:48:02.225606 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:02.225569 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-9.ec2.internal\" not found" Apr 20 17:48:02.326111 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:02.326006 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-9.ec2.internal\" not found" Apr 20 17:48:02.426503 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:02.426466 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-9.ec2.internal\" not found" Apr 20 17:48:02.527085 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:02.527045 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-9.ec2.internal\" not found" Apr 20 17:48:02.538525 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:02.538503 2581 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 17:48:02.538735 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:02.538710 2581 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 17:48:02.627196 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:02.627106 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-9.ec2.internal\" not found" Apr 20 17:48:02.628843 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:02.628819 2581 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 17:48:02.632022 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:02.631995 2581 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 17:48:02.641308 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:02.641288 2581 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 17:48:02.645438 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:02.645411 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 17:43:01 +0000 UTC" deadline="2027-12-01 14:36:11.698568788 +0000 UTC" Apr 20 17:48:02.645496 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:02.645439 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14156h48m9.053133001s" Apr 20 17:48:02.661634 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:02.661597 2581 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-4kkgb" Apr 20 17:48:02.669630 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:02.669611 2581 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-4kkgb" Apr 20 17:48:02.702512 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:02.702483 2581 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 17:48:02.728740 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:02.728713 
2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-9.ec2.internal" Apr 20 17:48:02.740733 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:02.740712 2581 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 17:48:02.741673 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:02.741650 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-9.ec2.internal" Apr 20 17:48:02.749963 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:02.749940 2581 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 17:48:02.835989 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:02.835963 2581 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 17:48:02.863694 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:02.863645 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5a0805b170e56dce7c8063947d7841f.slice/crio-4c0692777ea3646298fca9612071bbb2028d9012f5c92b350249a1fda3b77828 WatchSource:0}: Error finding container 4c0692777ea3646298fca9612071bbb2028d9012f5c92b350249a1fda3b77828: Status 404 returned error can't find the container with id 4c0692777ea3646298fca9612071bbb2028d9012f5c92b350249a1fda3b77828 Apr 20 17:48:02.864086 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:02.864052 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67dd5e3ce0ce81437a1036d614e3ee5e.slice/crio-ec6eb46a5a9300d1155daca33093b339859c8beec0e04f2b1825b0fee1796c46 WatchSource:0}: Error finding container ec6eb46a5a9300d1155daca33093b339859c8beec0e04f2b1825b0fee1796c46: Status 404 returned error can't find the container with id ec6eb46a5a9300d1155daca33093b339859c8beec0e04f2b1825b0fee1796c46 Apr 20 17:48:02.867585 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:02.867570 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 17:48:03.540893 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.540722 2581 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 17:48:03.606065 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.606033 2581 apiserver.go:52] "Watching apiserver" Apr 20 17:48:03.621018 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.620988 2581 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 17:48:03.622435 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.622386 2581 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-multus/multus-additional-cni-plugins-9xc6w","openshift-multus/network-metrics-daemon-7gff7","openshift-network-diagnostics/network-check-target-5228n","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb","openshift-cluster-node-tuning-operator/tuned-7hcs9","openshift-image-registry/node-ca-g8b7h","openshift-multus/multus-v8gth","openshift-network-operator/iptables-alerter-jcj94","openshift-ovn-kubernetes/ovnkube-node-lbls6","kube-system/konnectivity-agent-r8qd7","kube-system/kube-apiserver-proxy-ip-10-0-138-9.ec2.internal","openshift-dns/node-resolver-n9gmk","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-9.ec2.internal"] Apr 20 17:48:03.627331 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.627300 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.629843 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.629471 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:03.629843 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:03.629611 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gff7" podUID="c4102ca4-2dfd-487f-85a4-c91b3ae6797e" Apr 20 17:48:03.630192 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.630165 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 17:48:03.630482 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.630446 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-jmtxs\"" Apr 20 17:48:03.630597 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.630487 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 17:48:03.630597 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.630564 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 17:48:03.630597 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.630515 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 17:48:03.631919 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.631898 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb" Apr 20 17:48:03.634611 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.634394 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-s8pqs\"" Apr 20 17:48:03.634611 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.634435 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 17:48:03.634611 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.634462 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:48:03.634611 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.634396 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 17:48:03.634611 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:03.634554 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5228n" podUID="1496881d-daad-4889-957b-ae0b63332278" Apr 20 17:48:03.634611 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.634585 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 17:48:03.637258 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.636698 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.637258 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.636884 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9xc6w" Apr 20 17:48:03.638808 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.638790 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 17:48:03.638894 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.638819 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-host-run-netns\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.638894 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.638850 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-host-var-lib-cni-multus\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.638894 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.638878 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-os-release\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.639052 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.638907 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-hostroot\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.639052 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.638950 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/70ebccba-9caf-4e18-b7c2-430622fd3b07-multus-daemon-config\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.639052 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.638989 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdwpb\" (UniqueName: \"kubernetes.io/projected/70ebccba-9caf-4e18-b7c2-430622fd3b07-kube-api-access-wdwpb\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.639052 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.639021 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs\") pod \"network-metrics-daemon-7gff7\" (UID: \"c4102ca4-2dfd-487f-85a4-c91b3ae6797e\") " pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:03.639052 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.639048 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/502aa551-1a31-4b43-b111-f4090d9c5028-registration-dir\") pod \"aws-ebs-csi-driver-node-gwnrb\" (UID: \"502aa551-1a31-4b43-b111-f4090d9c5028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb" Apr 20 17:48:03.639292 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.639085 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-host-run-multus-certs\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.639292 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.639144 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/502aa551-1a31-4b43-b111-f4090d9c5028-etc-selinux\") pod \"aws-ebs-csi-driver-node-gwnrb\" (UID: \"502aa551-1a31-4b43-b111-f4090d9c5028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb" Apr 20 17:48:03.639292 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.639171 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-cnibin\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.639292 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.639208 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-jcj94" Apr 20 17:48:03.639292 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.639209 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-host-run-k8s-cni-cncf-io\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.639292 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.639240 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/502aa551-1a31-4b43-b111-f4090d9c5028-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gwnrb\" (UID: \"502aa551-1a31-4b43-b111-f4090d9c5028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb" Apr 20 17:48:03.639292 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.639266 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/502aa551-1a31-4b43-b111-f4090d9c5028-sys-fs\") pod \"aws-ebs-csi-driver-node-gwnrb\" (UID: \"502aa551-1a31-4b43-b111-f4090d9c5028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb" Apr 20 17:48:03.639292 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.639272 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-dgw2g\"" Apr 20 17:48:03.639292 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.639291 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7z7d\" (UniqueName: \"kubernetes.io/projected/502aa551-1a31-4b43-b111-f4090d9c5028-kube-api-access-d7z7d\") pod \"aws-ebs-csi-driver-node-gwnrb\" (UID: \"502aa551-1a31-4b43-b111-f4090d9c5028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb" Apr 20 17:48:03.639720 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.639352 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-system-cni-dir\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.639720 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.639381 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-multus-cni-dir\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.639720 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.639404 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-host-var-lib-cni-bin\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.639720 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.639429 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-host-var-lib-kubelet\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.639720 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.639455 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n7dd\" (UniqueName: \"kubernetes.io/projected/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-kube-api-access-2n7dd\") pod \"network-metrics-daemon-7gff7\" (UID: \"c4102ca4-2dfd-487f-85a4-c91b3ae6797e\") " pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:03.639720 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.639480 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlnrn\" (UniqueName: \"kubernetes.io/projected/1496881d-daad-4889-957b-ae0b63332278-kube-api-access-tlnrn\") pod \"network-check-target-5228n\" (UID: \"1496881d-daad-4889-957b-ae0b63332278\") " pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:48:03.639720 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.639553 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/502aa551-1a31-4b43-b111-f4090d9c5028-socket-dir\") pod \"aws-ebs-csi-driver-node-gwnrb\" (UID: \"502aa551-1a31-4b43-b111-f4090d9c5028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb" Apr 20 17:48:03.639720 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.639582 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-multus-socket-dir-parent\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.639720 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.639672 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-cnhpv\"" Apr 20 17:48:03.639720 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.639717 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-multus-conf-dir\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.640178 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.639734 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 17:48:03.640178 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.639743 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-etc-kubernetes\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.640178 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.639797 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/502aa551-1a31-4b43-b111-f4090d9c5028-device-dir\") pod \"aws-ebs-csi-driver-node-gwnrb\" (UID: 
\"502aa551-1a31-4b43-b111-f4090d9c5028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb" Apr 20 17:48:03.640178 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.639813 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/70ebccba-9caf-4e18-b7c2-430622fd3b07-cni-binary-copy\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.640178 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.639814 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 17:48:03.640178 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.639898 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 17:48:03.641452 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.641434 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 17:48:03.641907 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.641771 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jd47w\"" Apr 20 17:48:03.641907 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.641888 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 17:48:03.642046 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.641959 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-g8b7h" Apr 20 17:48:03.642197 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.642180 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 17:48:03.644308 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.644290 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 17:48:03.644395 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.644345 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 17:48:03.644598 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.644552 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 17:48:03.644729 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.644676 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-r8fd2\"" Apr 20 17:48:03.644729 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.644725 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.647007 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.646989 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-r8qd7" Apr 20 17:48:03.647618 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.647597 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-77kwv\"" Apr 20 17:48:03.647720 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.647662 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 17:48:03.647720 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.647702 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 17:48:03.647838 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.647819 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 17:48:03.647891 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.647850 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 17:48:03.648034 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.648017 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 17:48:03.648094 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.648062 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 17:48:03.650470 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.649985 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-n9gmk" Apr 20 17:48:03.650470 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.650010 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 17:48:03.650470 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.650058 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 17:48:03.650680 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.650593 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wt47c\"" Apr 20 17:48:03.653116 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.653055 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 17:48:03.653355 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.653336 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8fp5d\"" Apr 20 17:48:03.654237 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.654182 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 17:48:03.672084 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.672052 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 17:43:02 +0000 UTC" deadline="2028-01-15 05:29:20.842594438 +0000 UTC" Apr 20 17:48:03.672084 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.672081 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15227h41m17.170516539s" Apr 20 17:48:03.730404 ip-10-0-138-9 
kubenswrapper[2581]: I0420 17:48:03.730378 2581 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 17:48:03.740680 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.740649 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-host-cni-netd\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.740862 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.740690 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a41b8e52-ae34-439f-84de-ee703e85e441-env-overrides\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.740862 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.740746 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d36b98c1-fc6f-4438-84d5-25382aad1dc6-konnectivity-ca\") pod \"konnectivity-agent-r8qd7\" (UID: \"d36b98c1-fc6f-4438-84d5-25382aad1dc6\") " pod="kube-system/konnectivity-agent-r8qd7" Apr 20 17:48:03.740862 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.740787 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6667bb3e-c213-4fb7-a2f0-bb9a65372bf3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9xc6w\" (UID: \"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3\") " pod="openshift-multus/multus-additional-cni-plugins-9xc6w" Apr 20 17:48:03.740862 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.740815 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-etc-modprobe-d\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.740862 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.740849 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-cnibin\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.741120 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.740899 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-host-run-k8s-cni-cncf-io\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.741120 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.740927 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-cnibin\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.741120 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.740933 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj8cr\" (UniqueName: \"kubernetes.io/projected/1a64b0a7-defc-4ef8-b833-3e4b069784b3-kube-api-access-dj8cr\") pod \"node-ca-g8b7h\" (UID: \"1a64b0a7-defc-4ef8-b833-3e4b069784b3\") " pod="openshift-image-registry/node-ca-g8b7h" Apr 20 17:48:03.741120 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.740956 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-host\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.741120 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.740976 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/502aa551-1a31-4b43-b111-f4090d9c5028-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gwnrb\" (UID: \"502aa551-1a31-4b43-b111-f4090d9c5028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb" Apr 20 17:48:03.741120 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.740960 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-host-run-k8s-cni-cncf-io\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.741120 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741000 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d7z7d\" (UniqueName: \"kubernetes.io/projected/502aa551-1a31-4b43-b111-f4090d9c5028-kube-api-access-d7z7d\") pod \"aws-ebs-csi-driver-node-gwnrb\" (UID: \"502aa551-1a31-4b43-b111-f4090d9c5028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb" Apr 20 17:48:03.741120 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741042 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-system-cni-dir\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.741120 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741057 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/502aa551-1a31-4b43-b111-f4090d9c5028-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gwnrb\" (UID: \"502aa551-1a31-4b43-b111-f4090d9c5028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb" Apr 20 17:48:03.741120 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741089 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6667bb3e-c213-4fb7-a2f0-bb9a65372bf3-os-release\") pod \"multus-additional-cni-plugins-9xc6w\" (UID: \"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3\") " pod="openshift-multus/multus-additional-cni-plugins-9xc6w" Apr 20 17:48:03.741120 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741116 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj426\" (UniqueName: \"kubernetes.io/projected/6667bb3e-c213-4fb7-a2f0-bb9a65372bf3-kube-api-access-dj426\") pod \"multus-additional-cni-plugins-9xc6w\" 
(UID: \"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3\") " pod="openshift-multus/multus-additional-cni-plugins-9xc6w" Apr 20 17:48:03.741645 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741149 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-etc-sysctl-d\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.741645 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741174 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6667bb3e-c213-4fb7-a2f0-bb9a65372bf3-system-cni-dir\") pod \"multus-additional-cni-plugins-9xc6w\" (UID: \"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3\") " pod="openshift-multus/multus-additional-cni-plugins-9xc6w" Apr 20 17:48:03.741645 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741197 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6667bb3e-c213-4fb7-a2f0-bb9a65372bf3-cnibin\") pod \"multus-additional-cni-plugins-9xc6w\" (UID: \"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3\") " pod="openshift-multus/multus-additional-cni-plugins-9xc6w" Apr 20 17:48:03.741645 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741223 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6667bb3e-c213-4fb7-a2f0-bb9a65372bf3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9xc6w\" (UID: \"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3\") " pod="openshift-multus/multus-additional-cni-plugins-9xc6w" Apr 20 17:48:03.741645 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741247 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4463b9d9-8fcb-46f7-aa92-63bc974f8de4-host-slash\") pod \"iptables-alerter-jcj94\" (UID: \"4463b9d9-8fcb-46f7-aa92-63bc974f8de4\") " pod="openshift-network-operator/iptables-alerter-jcj94" Apr 20 17:48:03.741645 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741259 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-system-cni-dir\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.741645 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741275 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-etc-kubernetes\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.741645 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741302 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1a64b0a7-defc-4ef8-b833-3e4b069784b3-host\") pod \"node-ca-g8b7h\" (UID: \"1a64b0a7-defc-4ef8-b833-3e4b069784b3\") " pod="openshift-image-registry/node-ca-g8b7h" Apr 20 17:48:03.741645 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741338 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-var-lib-openvswitch\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.741645 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741372 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/502aa551-1a31-4b43-b111-f4090d9c5028-device-dir\") pod \"aws-ebs-csi-driver-node-gwnrb\" (UID: \"502aa551-1a31-4b43-b111-f4090d9c5028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb" Apr 20 17:48:03.741645 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741376 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-etc-kubernetes\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.741645 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741398 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/70ebccba-9caf-4e18-b7c2-430622fd3b07-cni-binary-copy\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.741645 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741426 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-host-var-lib-cni-multus\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.741645 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741454 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9h8p\" (UniqueName: \"kubernetes.io/projected/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-kube-api-access-h9h8p\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.741645 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741477 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/502aa551-1a31-4b43-b111-f4090d9c5028-device-dir\") pod \"aws-ebs-csi-driver-node-gwnrb\" (UID: \"502aa551-1a31-4b43-b111-f4090d9c5028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb" Apr 20 17:48:03.741645 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741480 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a41b8e52-ae34-439f-84de-ee703e85e441-ovnkube-script-lib\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.741645 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741516 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-host-var-lib-cni-multus\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " 
pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.742410 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741515 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d36b98c1-fc6f-4438-84d5-25382aad1dc6-agent-certs\") pod \"konnectivity-agent-r8qd7\" (UID: \"d36b98c1-fc6f-4438-84d5-25382aad1dc6\") " pod="kube-system/konnectivity-agent-r8qd7" Apr 20 17:48:03.742410 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741541 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-hostroot\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.742410 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741558 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/70ebccba-9caf-4e18-b7c2-430622fd3b07-multus-daemon-config\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.742410 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741576 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-etc-kubernetes\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.742410 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741593 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-lib-modules\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.742410 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741618 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-run-openvswitch\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.742410 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741635 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-hostroot\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.742410 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741671 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-host-run-ovn-kubernetes\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.742410 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741697 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a1c5a7fc-92d7-49a4-85c8-128fe8e46b19-tmp-dir\") pod 
\"node-resolver-n9gmk\" (UID: \"a1c5a7fc-92d7-49a4-85c8-128fe8e46b19\") " pod="openshift-dns/node-resolver-n9gmk" Apr 20 17:48:03.742410 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741738 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-etc-sysconfig\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.742410 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741762 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-sys\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.742410 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741790 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-host-slash\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.742410 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741816 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a41b8e52-ae34-439f-84de-ee703e85e441-ovnkube-config\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.742410 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741843 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/502aa551-1a31-4b43-b111-f4090d9c5028-registration-dir\") pod \"aws-ebs-csi-driver-node-gwnrb\" (UID: \"502aa551-1a31-4b43-b111-f4090d9c5028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb" Apr 20 17:48:03.742410 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741869 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-host-run-multus-certs\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.742410 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741894 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-var-lib-kubelet\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.742410 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741918 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-run-systemd\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.743188 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.741993 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a1c5a7fc-92d7-49a4-85c8-128fe8e46b19-hosts-file\") pod \"node-resolver-n9gmk\" (UID: \"a1c5a7fc-92d7-49a4-85c8-128fe8e46b19\") " pod="openshift-dns/node-resolver-n9gmk" Apr 20 17:48:03.743188 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742018 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbphl\" (UniqueName: \"kubernetes.io/projected/a1c5a7fc-92d7-49a4-85c8-128fe8e46b19-kube-api-access-hbphl\") pod \"node-resolver-n9gmk\" (UID: \"a1c5a7fc-92d7-49a4-85c8-128fe8e46b19\") " pod="openshift-dns/node-resolver-n9gmk" Apr 20 17:48:03.743188 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742058 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/502aa551-1a31-4b43-b111-f4090d9c5028-etc-selinux\") pod \"aws-ebs-csi-driver-node-gwnrb\" (UID: \"502aa551-1a31-4b43-b111-f4090d9c5028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb" Apr 20 17:48:03.743188 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742084 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-tmp\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.743188 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742108 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-host-kubelet\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.743188 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742143 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-systemd-units\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.743188 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742173 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pjt6\" (UniqueName: \"kubernetes.io/projected/a41b8e52-ae34-439f-84de-ee703e85e441-kube-api-access-8pjt6\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.743188 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742215 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-etc-systemd\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.743188 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742244 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/502aa551-1a31-4b43-b111-f4090d9c5028-sys-fs\") pod \"aws-ebs-csi-driver-node-gwnrb\" (UID: 
\"502aa551-1a31-4b43-b111-f4090d9c5028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb" Apr 20 17:48:03.743188 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742269 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/70ebccba-9caf-4e18-b7c2-430622fd3b07-cni-binary-copy\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.743188 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742284 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6667bb3e-c213-4fb7-a2f0-bb9a65372bf3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9xc6w\" (UID: \"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3\") " pod="openshift-multus/multus-additional-cni-plugins-9xc6w" Apr 20 17:48:03.743188 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742211 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/70ebccba-9caf-4e18-b7c2-430622fd3b07-multus-daemon-config\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.743188 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742364 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-host-run-multus-certs\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.743188 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742411 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-multus-cni-dir\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.743188 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742439 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-host-var-lib-cni-bin\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.743188 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742466 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-host-var-lib-kubelet\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.743188 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742493 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/502aa551-1a31-4b43-b111-f4090d9c5028-etc-selinux\") pod \"aws-ebs-csi-driver-node-gwnrb\" (UID: \"502aa551-1a31-4b43-b111-f4090d9c5028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb" Apr 20 17:48:03.743916 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742518 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2n7dd\" (UniqueName: 
\"kubernetes.io/projected/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-kube-api-access-2n7dd\") pod \"network-metrics-daemon-7gff7\" (UID: \"c4102ca4-2dfd-487f-85a4-c91b3ae6797e\") " pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:03.743916 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742548 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlnrn\" (UniqueName: \"kubernetes.io/projected/1496881d-daad-4889-957b-ae0b63332278-kube-api-access-tlnrn\") pod \"network-check-target-5228n\" (UID: \"1496881d-daad-4889-957b-ae0b63332278\") " pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:48:03.743916 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742564 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/502aa551-1a31-4b43-b111-f4090d9c5028-sys-fs\") pod \"aws-ebs-csi-driver-node-gwnrb\" (UID: \"502aa551-1a31-4b43-b111-f4090d9c5028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb" Apr 20 17:48:03.743916 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742576 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-run-ovn\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.743916 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742662 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/502aa551-1a31-4b43-b111-f4090d9c5028-registration-dir\") pod \"aws-ebs-csi-driver-node-gwnrb\" (UID: \"502aa551-1a31-4b43-b111-f4090d9c5028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb" Apr 20 17:48:03.743916 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742699 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a41b8e52-ae34-439f-84de-ee703e85e441-ovn-node-metrics-cert\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.743916 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742765 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-multus-cni-dir\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.743916 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742777 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-host-var-lib-kubelet\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.743916 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742808 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/502aa551-1a31-4b43-b111-f4090d9c5028-socket-dir\") pod \"aws-ebs-csi-driver-node-gwnrb\" (UID: \"502aa551-1a31-4b43-b111-f4090d9c5028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb" Apr 20 17:48:03.743916 ip-10-0-138-9 
kubenswrapper[2581]: I0420 17:48:03.742822 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-host-var-lib-cni-bin\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.743916 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742860 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-multus-socket-dir-parent\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.743916 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742894 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-multus-conf-dir\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.743916 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742898 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/502aa551-1a31-4b43-b111-f4090d9c5028-socket-dir\") pod \"aws-ebs-csi-driver-node-gwnrb\" (UID: \"502aa551-1a31-4b43-b111-f4090d9c5028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb" Apr 20 17:48:03.743916 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742924 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-run\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.743916 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742963 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-multus-conf-dir\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.743916 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.742979 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-multus-socket-dir-parent\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.743916 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.743011 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-etc-tuned\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.744659 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.743058 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-host-run-netns\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.744659 ip-10-0-138-9 
kubenswrapper[2581]: I0420 17:48:03.743096 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-host-run-netns\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.744659 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.743105 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6667bb3e-c213-4fb7-a2f0-bb9a65372bf3-cni-binary-copy\") pod \"multus-additional-cni-plugins-9xc6w\" (UID: \"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3\") " pod="openshift-multus/multus-additional-cni-plugins-9xc6w" Apr 20 17:48:03.744659 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.743137 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-node-log\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.744659 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.743165 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.744659 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.743196 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-os-release\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.744659 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.743224 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wdwpb\" (UniqueName: \"kubernetes.io/projected/70ebccba-9caf-4e18-b7c2-430622fd3b07-kube-api-access-wdwpb\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.744659 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.743257 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/70ebccba-9caf-4e18-b7c2-430622fd3b07-os-release\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.744659 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.743295 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4463b9d9-8fcb-46f7-aa92-63bc974f8de4-iptables-alerter-script\") pod \"iptables-alerter-jcj94\" (UID: \"4463b9d9-8fcb-46f7-aa92-63bc974f8de4\") " pod="openshift-network-operator/iptables-alerter-jcj94" Apr 20 17:48:03.744659 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.743409 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1a64b0a7-defc-4ef8-b833-3e4b069784b3-serviceca\") pod \"node-ca-g8b7h\" 
(UID: \"1a64b0a7-defc-4ef8-b833-3e4b069784b3\") " pod="openshift-image-registry/node-ca-g8b7h" Apr 20 17:48:03.744659 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.743435 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-etc-sysctl-conf\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.744659 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.743461 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-etc-openvswitch\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.744659 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.743496 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-log-socket\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.744659 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.743519 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-host-cni-bin\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.744659 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.743549 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs\") pod \"network-metrics-daemon-7gff7\" (UID: \"c4102ca4-2dfd-487f-85a4-c91b3ae6797e\") " pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:03.744659 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.743636 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsfqm\" (UniqueName: \"kubernetes.io/projected/4463b9d9-8fcb-46f7-aa92-63bc974f8de4-kube-api-access-hsfqm\") pod \"iptables-alerter-jcj94\" (UID: \"4463b9d9-8fcb-46f7-aa92-63bc974f8de4\") " pod="openshift-network-operator/iptables-alerter-jcj94" Apr 20 17:48:03.744659 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:03.743653 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:03.745425 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.743695 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-host-run-netns\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.745425 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:03.743777 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs podName:c4102ca4-2dfd-487f-85a4-c91b3ae6797e nodeName:}" failed. 
No retries permitted until 2026-04-20 17:48:04.24374644 +0000 UTC m=+3.019277213 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs") pod "network-metrics-daemon-7gff7" (UID: "c4102ca4-2dfd-487f-85a4-c91b3ae6797e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:03.749482 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.749444 2581 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 17:48:03.750171 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:03.749885 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 17:48:03.750171 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:03.749908 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 17:48:03.750171 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:03.749922 2581 projected.go:194] Error preparing data for projected volume kube-api-access-tlnrn for pod openshift-network-diagnostics/network-check-target-5228n: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:03.750171 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:03.749997 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1496881d-daad-4889-957b-ae0b63332278-kube-api-access-tlnrn podName:1496881d-daad-4889-957b-ae0b63332278 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:04.249977764 +0000 UTC m=+3.025508550 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tlnrn" (UniqueName: "kubernetes.io/projected/1496881d-daad-4889-957b-ae0b63332278-kube-api-access-tlnrn") pod "network-check-target-5228n" (UID: "1496881d-daad-4889-957b-ae0b63332278") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:03.753914 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.753889 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7z7d\" (UniqueName: \"kubernetes.io/projected/502aa551-1a31-4b43-b111-f4090d9c5028-kube-api-access-d7z7d\") pod \"aws-ebs-csi-driver-node-gwnrb\" (UID: \"502aa551-1a31-4b43-b111-f4090d9c5028\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb" Apr 20 17:48:03.753914 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.753903 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n7dd\" (UniqueName: \"kubernetes.io/projected/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-kube-api-access-2n7dd\") pod \"network-metrics-daemon-7gff7\" (UID: \"c4102ca4-2dfd-487f-85a4-c91b3ae6797e\") " pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:03.753914 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.753888 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdwpb\" (UniqueName: \"kubernetes.io/projected/70ebccba-9caf-4e18-b7c2-430622fd3b07-kube-api-access-wdwpb\") pod \"multus-v8gth\" (UID: \"70ebccba-9caf-4e18-b7c2-430622fd3b07\") " pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.765560 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.765509 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-9.ec2.internal" event={"ID":"67dd5e3ce0ce81437a1036d614e3ee5e","Type":"ContainerStarted","Data":"ec6eb46a5a9300d1155daca33093b339859c8beec0e04f2b1825b0fee1796c46"} Apr 20 17:48:03.766666 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.766599 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-9.ec2.internal" event={"ID":"d5a0805b170e56dce7c8063947d7841f","Type":"ContainerStarted","Data":"4c0692777ea3646298fca9612071bbb2028d9012f5c92b350249a1fda3b77828"} Apr 20 17:48:03.844654 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.844558 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-run-ovn\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.844654 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.844596 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a41b8e52-ae34-439f-84de-ee703e85e441-ovn-node-metrics-cert\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.844654 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.844617 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-run\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 
17:48:03.844654 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.844644 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-run-ovn\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.844969 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.844653 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-etc-tuned\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.844969 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.844698 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6667bb3e-c213-4fb7-a2f0-bb9a65372bf3-cni-binary-copy\") pod \"multus-additional-cni-plugins-9xc6w\" (UID: \"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3\") " pod="openshift-multus/multus-additional-cni-plugins-9xc6w" Apr 20 17:48:03.844969 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.844723 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-node-log\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.844969 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.844748 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.844969 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.844758 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-run\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.844969 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.844778 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4463b9d9-8fcb-46f7-aa92-63bc974f8de4-iptables-alerter-script\") pod \"iptables-alerter-jcj94\" (UID: \"4463b9d9-8fcb-46f7-aa92-63bc974f8de4\") " pod="openshift-network-operator/iptables-alerter-jcj94" Apr 20 17:48:03.844969 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.844813 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-node-log\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.844969 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.844844 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1a64b0a7-defc-4ef8-b833-3e4b069784b3-serviceca\") pod \"node-ca-g8b7h\" (UID: \"1a64b0a7-defc-4ef8-b833-3e4b069784b3\") " 
pod="openshift-image-registry/node-ca-g8b7h" Apr 20 17:48:03.844969 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.844871 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-etc-sysctl-conf\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.844969 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.844896 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-etc-openvswitch\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.844969 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.844922 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-log-socket\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.844969 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.844946 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-host-cni-bin\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.845508 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845022 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.845508 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845106 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hsfqm\" (UniqueName: \"kubernetes.io/projected/4463b9d9-8fcb-46f7-aa92-63bc974f8de4-kube-api-access-hsfqm\") pod \"iptables-alerter-jcj94\" (UID: \"4463b9d9-8fcb-46f7-aa92-63bc974f8de4\") " pod="openshift-network-operator/iptables-alerter-jcj94" Apr 20 17:48:03.845508 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845163 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-host-run-netns\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.845508 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845189 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-host-cni-netd\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.845508 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845213 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/a41b8e52-ae34-439f-84de-ee703e85e441-env-overrides\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.845508 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845238 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d36b98c1-fc6f-4438-84d5-25382aad1dc6-konnectivity-ca\") pod \"konnectivity-agent-r8qd7\" (UID: \"d36b98c1-fc6f-4438-84d5-25382aad1dc6\") " pod="kube-system/konnectivity-agent-r8qd7" Apr 20 17:48:03.845508 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845264 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6667bb3e-c213-4fb7-a2f0-bb9a65372bf3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9xc6w\" (UID: \"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3\") " pod="openshift-multus/multus-additional-cni-plugins-9xc6w" Apr 20 17:48:03.845508 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845290 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-etc-modprobe-d\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.845508 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845319 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dj8cr\" (UniqueName: \"kubernetes.io/projected/1a64b0a7-defc-4ef8-b833-3e4b069784b3-kube-api-access-dj8cr\") pod \"node-ca-g8b7h\" (UID: \"1a64b0a7-defc-4ef8-b833-3e4b069784b3\") " pod="openshift-image-registry/node-ca-g8b7h" Apr 20 17:48:03.845508 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845347 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-host\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.845508 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845374 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6667bb3e-c213-4fb7-a2f0-bb9a65372bf3-os-release\") pod \"multus-additional-cni-plugins-9xc6w\" (UID: \"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3\") " pod="openshift-multus/multus-additional-cni-plugins-9xc6w" Apr 20 17:48:03.845508 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845400 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dj426\" (UniqueName: \"kubernetes.io/projected/6667bb3e-c213-4fb7-a2f0-bb9a65372bf3-kube-api-access-dj426\") pod \"multus-additional-cni-plugins-9xc6w\" (UID: \"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3\") " pod="openshift-multus/multus-additional-cni-plugins-9xc6w" Apr 20 17:48:03.845508 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845423 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-etc-sysctl-d\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.845508 ip-10-0-138-9 kubenswrapper[2581]: 
I0420 17:48:03.845449 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6667bb3e-c213-4fb7-a2f0-bb9a65372bf3-system-cni-dir\") pod \"multus-additional-cni-plugins-9xc6w\" (UID: \"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3\") " pod="openshift-multus/multus-additional-cni-plugins-9xc6w" Apr 20 17:48:03.845508 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845473 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6667bb3e-c213-4fb7-a2f0-bb9a65372bf3-cnibin\") pod \"multus-additional-cni-plugins-9xc6w\" (UID: \"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3\") " pod="openshift-multus/multus-additional-cni-plugins-9xc6w" Apr 20 17:48:03.845508 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845507 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6667bb3e-c213-4fb7-a2f0-bb9a65372bf3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9xc6w\" (UID: \"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3\") " pod="openshift-multus/multus-additional-cni-plugins-9xc6w" Apr 20 17:48:03.846250 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845521 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-etc-sysctl-conf\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.846250 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845532 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4463b9d9-8fcb-46f7-aa92-63bc974f8de4-host-slash\") pod \"iptables-alerter-jcj94\" (UID: \"4463b9d9-8fcb-46f7-aa92-63bc974f8de4\") " pod="openshift-network-operator/iptables-alerter-jcj94" Apr 20 17:48:03.846250 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845575 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1a64b0a7-defc-4ef8-b833-3e4b069784b3-host\") pod \"node-ca-g8b7h\" (UID: \"1a64b0a7-defc-4ef8-b833-3e4b069784b3\") " pod="openshift-image-registry/node-ca-g8b7h" Apr 20 17:48:03.846250 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845579 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4463b9d9-8fcb-46f7-aa92-63bc974f8de4-host-slash\") pod \"iptables-alerter-jcj94\" (UID: \"4463b9d9-8fcb-46f7-aa92-63bc974f8de4\") " pod="openshift-network-operator/iptables-alerter-jcj94" Apr 20 17:48:03.846250 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845604 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-var-lib-openvswitch\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.846250 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845641 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-host-cni-bin\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.846250 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845661 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h9h8p\" (UniqueName: \"kubernetes.io/projected/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-kube-api-access-h9h8p\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.846250 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845688 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a41b8e52-ae34-439f-84de-ee703e85e441-ovnkube-script-lib\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.846250 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845714 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d36b98c1-fc6f-4438-84d5-25382aad1dc6-agent-certs\") pod \"konnectivity-agent-r8qd7\" (UID: \"d36b98c1-fc6f-4438-84d5-25382aad1dc6\") " pod="kube-system/konnectivity-agent-r8qd7" Apr 20 17:48:03.846250 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845742 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-etc-kubernetes\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.846250 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845830 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-lib-modules\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.846250 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845862 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-run-openvswitch\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.846250 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845888 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-host-run-ovn-kubernetes\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.846250 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845908 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-host-run-netns\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.846250 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845918 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a1c5a7fc-92d7-49a4-85c8-128fe8e46b19-tmp-dir\") pod 
\"node-resolver-n9gmk\" (UID: \"a1c5a7fc-92d7-49a4-85c8-128fe8e46b19\") " pod="openshift-dns/node-resolver-n9gmk" Apr 20 17:48:03.846250 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845945 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-etc-sysconfig\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.846250 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845952 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-host-cni-netd\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.846250 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845972 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-sys\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.847124 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845991 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1a64b0a7-defc-4ef8-b833-3e4b069784b3-serviceca\") pod \"node-ca-g8b7h\" (UID: \"1a64b0a7-defc-4ef8-b833-3e4b069784b3\") " pod="openshift-image-registry/node-ca-g8b7h" Apr 20 17:48:03.847124 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845999 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-host-slash\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.847124 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.846030 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a41b8e52-ae34-439f-84de-ee703e85e441-ovnkube-config\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.847124 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.846061 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-var-lib-kubelet\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.847124 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.846089 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-run-systemd\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.847124 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.846116 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a1c5a7fc-92d7-49a4-85c8-128fe8e46b19-hosts-file\") pod \"node-resolver-n9gmk\" (UID: 
\"a1c5a7fc-92d7-49a4-85c8-128fe8e46b19\") " pod="openshift-dns/node-resolver-n9gmk" Apr 20 17:48:03.847124 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.846141 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbphl\" (UniqueName: \"kubernetes.io/projected/a1c5a7fc-92d7-49a4-85c8-128fe8e46b19-kube-api-access-hbphl\") pod \"node-resolver-n9gmk\" (UID: \"a1c5a7fc-92d7-49a4-85c8-128fe8e46b19\") " pod="openshift-dns/node-resolver-n9gmk" Apr 20 17:48:03.847124 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.846167 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-tmp\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.847124 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.846195 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-host-kubelet\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.847124 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.846220 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-systemd-units\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.847124 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.846246 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8pjt6\" (UniqueName: \"kubernetes.io/projected/a41b8e52-ae34-439f-84de-ee703e85e441-kube-api-access-8pjt6\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.847124 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.846271 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-etc-systemd\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.847124 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.846300 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6667bb3e-c213-4fb7-a2f0-bb9a65372bf3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9xc6w\" (UID: \"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3\") " pod="openshift-multus/multus-additional-cni-plugins-9xc6w" Apr 20 17:48:03.847124 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.846346 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a41b8e52-ae34-439f-84de-ee703e85e441-env-overrides\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.847124 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.846549 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-etc-sysconfig\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.847124 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.846597 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-etc-openvswitch\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.847124 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.846674 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6667bb3e-c213-4fb7-a2f0-bb9a65372bf3-os-release\") pod \"multus-additional-cni-plugins-9xc6w\" (UID: \"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3\") " pod="openshift-multus/multus-additional-cni-plugins-9xc6w" Apr 20 17:48:03.847913 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.845314 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6667bb3e-c213-4fb7-a2f0-bb9a65372bf3-cni-binary-copy\") pod \"multus-additional-cni-plugins-9xc6w\" (UID: \"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3\") " pod="openshift-multus/multus-additional-cni-plugins-9xc6w" Apr 20 17:48:03.847913 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.846749 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d36b98c1-fc6f-4438-84d5-25382aad1dc6-konnectivity-ca\") pod \"konnectivity-agent-r8qd7\" (UID: \"d36b98c1-fc6f-4438-84d5-25382aad1dc6\") " pod="kube-system/konnectivity-agent-r8qd7" Apr 20 17:48:03.847913 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.846783 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-host\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.847913 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.846843 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6667bb3e-c213-4fb7-a2f0-bb9a65372bf3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9xc6w\" (UID: \"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3\") " pod="openshift-multus/multus-additional-cni-plugins-9xc6w" Apr 20 17:48:03.847913 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.846919 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-sys\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.847913 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.846957 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6667bb3e-c213-4fb7-a2f0-bb9a65372bf3-system-cni-dir\") pod \"multus-additional-cni-plugins-9xc6w\" (UID: \"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3\") " pod="openshift-multus/multus-additional-cni-plugins-9xc6w" Apr 20 17:48:03.847913 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.847000 2581 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6667bb3e-c213-4fb7-a2f0-bb9a65372bf3-cnibin\") pod \"multus-additional-cni-plugins-9xc6w\" (UID: \"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3\") " pod="openshift-multus/multus-additional-cni-plugins-9xc6w" Apr 20 17:48:03.847913 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.847055 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-etc-sysctl-d\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.847913 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.847101 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-host-slash\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.847913 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.847129 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6667bb3e-c213-4fb7-a2f0-bb9a65372bf3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9xc6w\" (UID: \"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3\") " pod="openshift-multus/multus-additional-cni-plugins-9xc6w" Apr 20 17:48:03.847913 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.847173 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6667bb3e-c213-4fb7-a2f0-bb9a65372bf3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9xc6w\" (UID: \"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3\") " pod="openshift-multus/multus-additional-cni-plugins-9xc6w" Apr 20 17:48:03.847913 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.847178 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-run-openvswitch\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.847913 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.847202 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-etc-kubernetes\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.847913 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.847255 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-host-run-ovn-kubernetes\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.847913 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.847278 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-lib-modules\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.847913 ip-10-0-138-9 
kubenswrapper[2581]: I0420 17:48:03.847330 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-log-socket\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.847913 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.847344 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-var-lib-kubelet\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.848596 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.847365 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-var-lib-openvswitch\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.848596 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.847387 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-run-systemd\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.848596 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.847402 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1a64b0a7-defc-4ef8-b833-3e4b069784b3-host\") pod \"node-ca-g8b7h\" (UID: \"1a64b0a7-defc-4ef8-b833-3e4b069784b3\") " pod="openshift-image-registry/node-ca-g8b7h" Apr 20 17:48:03.848596 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.847420 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a1c5a7fc-92d7-49a4-85c8-128fe8e46b19-tmp-dir\") pod \"node-resolver-n9gmk\" (UID: \"a1c5a7fc-92d7-49a4-85c8-128fe8e46b19\") " pod="openshift-dns/node-resolver-n9gmk" Apr 20 17:48:03.848596 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.847491 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a1c5a7fc-92d7-49a4-85c8-128fe8e46b19-hosts-file\") pod \"node-resolver-n9gmk\" (UID: \"a1c5a7fc-92d7-49a4-85c8-128fe8e46b19\") " pod="openshift-dns/node-resolver-n9gmk" Apr 20 17:48:03.848596 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.847510 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-etc-modprobe-d\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.848596 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.847552 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-systemd-units\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.848596 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.847565 2581 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a41b8e52-ae34-439f-84de-ee703e85e441-ovnkube-config\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.848596 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.847579 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a41b8e52-ae34-439f-84de-ee703e85e441-host-kubelet\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.848596 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.847740 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-etc-systemd\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.848596 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.847827 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-etc-tuned\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.848596 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.847939 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a41b8e52-ae34-439f-84de-ee703e85e441-ovn-node-metrics-cert\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.848596 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.848197 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a41b8e52-ae34-439f-84de-ee703e85e441-ovnkube-script-lib\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.848596 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.848351 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4463b9d9-8fcb-46f7-aa92-63bc974f8de4-iptables-alerter-script\") pod \"iptables-alerter-jcj94\" (UID: \"4463b9d9-8fcb-46f7-aa92-63bc974f8de4\") " pod="openshift-network-operator/iptables-alerter-jcj94" Apr 20 17:48:03.849605 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.849586 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-tmp\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.850366 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.850346 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d36b98c1-fc6f-4438-84d5-25382aad1dc6-agent-certs\") pod \"konnectivity-agent-r8qd7\" (UID: \"d36b98c1-fc6f-4438-84d5-25382aad1dc6\") " pod="kube-system/konnectivity-agent-r8qd7" Apr 20 17:48:03.854231 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.854184 2581 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-hsfqm\" (UniqueName: \"kubernetes.io/projected/4463b9d9-8fcb-46f7-aa92-63bc974f8de4-kube-api-access-hsfqm\") pod \"iptables-alerter-jcj94\" (UID: \"4463b9d9-8fcb-46f7-aa92-63bc974f8de4\") " pod="openshift-network-operator/iptables-alerter-jcj94" Apr 20 17:48:03.854783 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.854740 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj8cr\" (UniqueName: \"kubernetes.io/projected/1a64b0a7-defc-4ef8-b833-3e4b069784b3-kube-api-access-dj8cr\") pod \"node-ca-g8b7h\" (UID: \"1a64b0a7-defc-4ef8-b833-3e4b069784b3\") " pod="openshift-image-registry/node-ca-g8b7h" Apr 20 17:48:03.855024 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.854980 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbphl\" (UniqueName: \"kubernetes.io/projected/a1c5a7fc-92d7-49a4-85c8-128fe8e46b19-kube-api-access-hbphl\") pod \"node-resolver-n9gmk\" (UID: \"a1c5a7fc-92d7-49a4-85c8-128fe8e46b19\") " pod="openshift-dns/node-resolver-n9gmk" Apr 20 17:48:03.855386 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.855357 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9h8p\" (UniqueName: \"kubernetes.io/projected/3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19-kube-api-access-h9h8p\") pod \"tuned-7hcs9\" (UID: \"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19\") " pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.855708 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.855685 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj426\" (UniqueName: \"kubernetes.io/projected/6667bb3e-c213-4fb7-a2f0-bb9a65372bf3-kube-api-access-dj426\") pod \"multus-additional-cni-plugins-9xc6w\" (UID: \"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3\") " pod="openshift-multus/multus-additional-cni-plugins-9xc6w" Apr 20 17:48:03.856763 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.856743 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pjt6\" (UniqueName: \"kubernetes.io/projected/a41b8e52-ae34-439f-84de-ee703e85e441-kube-api-access-8pjt6\") pod \"ovnkube-node-lbls6\" (UID: \"a41b8e52-ae34-439f-84de-ee703e85e441\") " pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.939863 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.939827 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-v8gth" Apr 20 17:48:03.948827 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.948803 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb" Apr 20 17:48:03.958799 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.958772 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" Apr 20 17:48:03.965401 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.965372 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9xc6w" Apr 20 17:48:03.975095 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.975072 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-jcj94" Apr 20 17:48:03.981708 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.981687 2581 util.go:30] "No sandbox for pod can be found. 
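The run of MountVolume.SetUp successes above ends with the per-pod kube-api-access-* volumes (volume type kubernetes.io/projected). Each of these is the projected volume the service-account admission plugin adds to pods, bundling a bound token, the cluster CA bundle, and the pod's namespace; the kube-api-access-tlnrn failures further down are exactly this projection failing because its source ConfigMaps are not yet registered. A minimal sketch of the shape such a volume has, using the client-go types (the name and expiry below are illustrative, not taken from this cluster):

```go
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	expiry := int64(3607) // typical bound-token lifetime requested by the admission plugin

	// Shape of a kube-api-access-* volume as it appears in a pod spec:
	// a projected volume bundling a bound SA token, the cluster CA bundle,
	// and the pod's namespace from the downward API.
	vol := corev1.Volume{
		Name: "kube-api-access-hsfqm", // suffix is random per pod
		VolumeSource: corev1.VolumeSource{
			Projected: &corev1.ProjectedVolumeSource{
				Sources: []corev1.VolumeProjection{
					{ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
						Path:              "token",
						ExpirationSeconds: &expiry,
					}},
					{ConfigMap: &corev1.ConfigMapProjection{
						LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
						Items:                []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}},
					}},
					{DownwardAPI: &corev1.DownwardAPIProjection{
						Items: []corev1.DownwardAPIVolumeFile{{
							Path:     "namespace",
							FieldRef: &corev1.ObjectFieldSelector{FieldPath: "metadata.namespace"},
						}},
					}},
				},
			},
		},
	}
	fmt.Println(vol.Name)
}
```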
Need to start a new one" pod="openshift-image-registry/node-ca-g8b7h" Apr 20 17:48:03.990381 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.990357 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:03.997019 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:03.996987 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-r8qd7" Apr 20 17:48:04.003596 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:04.003571 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-n9gmk" Apr 20 17:48:04.248741 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:04.248654 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs\") pod \"network-metrics-daemon-7gff7\" (UID: \"c4102ca4-2dfd-487f-85a4-c91b3ae6797e\") " pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:04.248887 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:04.248798 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:04.248887 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:04.248870 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs podName:c4102ca4-2dfd-487f-85a4-c91b3ae6797e nodeName:}" failed. No retries permitted until 2026-04-20 17:48:05.248851125 +0000 UTC m=+4.024381918 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs") pod "network-metrics-daemon-7gff7" (UID: "c4102ca4-2dfd-487f-85a4-c91b3ae6797e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:04.349171 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:04.349137 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlnrn\" (UniqueName: \"kubernetes.io/projected/1496881d-daad-4889-957b-ae0b63332278-kube-api-access-tlnrn\") pod \"network-check-target-5228n\" (UID: \"1496881d-daad-4889-957b-ae0b63332278\") " pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:48:04.349316 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:04.349300 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 17:48:04.349354 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:04.349322 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 17:48:04.349354 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:04.349332 2581 projected.go:194] Error preparing data for projected volume kube-api-access-tlnrn for pod openshift-network-diagnostics/network-check-target-5228n: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:04.349415 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:04.349385 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1496881d-daad-4889-957b-ae0b63332278-kube-api-access-tlnrn 
podName:1496881d-daad-4889-957b-ae0b63332278 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:05.349369537 +0000 UTC m=+4.124900309 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-tlnrn" (UniqueName: "kubernetes.io/projected/1496881d-daad-4889-957b-ae0b63332278-kube-api-access-tlnrn") pod "network-check-target-5228n" (UID: "1496881d-daad-4889-957b-ae0b63332278") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:04.570852 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:04.570813 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fe8f78d_28c5_41f6_963a_4f1cd1a7cf19.slice/crio-646db08ed793a917d0910224396625787b0dd8d90ebd4752d6d02ec43cf81f54 WatchSource:0}: Error finding container 646db08ed793a917d0910224396625787b0dd8d90ebd4752d6d02ec43cf81f54: Status 404 returned error can't find the container with id 646db08ed793a917d0910224396625787b0dd8d90ebd4752d6d02ec43cf81f54 Apr 20 17:48:04.574048 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:04.574026 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda41b8e52_ae34_439f_84de_ee703e85e441.slice/crio-ba694f68500b7e8e26a61f8da08b1fef3c74e5bcd214e4e7b410243594d2ac8d WatchSource:0}: Error finding container ba694f68500b7e8e26a61f8da08b1fef3c74e5bcd214e4e7b410243594d2ac8d: Status 404 returned error can't find the container with id ba694f68500b7e8e26a61f8da08b1fef3c74e5bcd214e4e7b410243594d2ac8d Apr 20 17:48:04.576778 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:04.576753 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4463b9d9_8fcb_46f7_aa92_63bc974f8de4.slice/crio-b5a3f17bcd68ccda4972182e59598f7b7199465ac30b86c186e70705a0d9752b WatchSource:0}: Error finding container b5a3f17bcd68ccda4972182e59598f7b7199465ac30b86c186e70705a0d9752b: Status 404 returned error can't find the container with id b5a3f17bcd68ccda4972182e59598f7b7199465ac30b86c186e70705a0d9752b Apr 20 17:48:04.597723 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:04.597696 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd36b98c1_fc6f_4438_84d5_25382aad1dc6.slice/crio-a7ec078f5699c0e11e20c5f5df51172555981a13fb2f1aa830bc8b2e373b76a0 WatchSource:0}: Error finding container a7ec078f5699c0e11e20c5f5df51172555981a13fb2f1aa830bc8b2e373b76a0: Status 404 returned error can't find the container with id a7ec078f5699c0e11e20c5f5df51172555981a13fb2f1aa830bc8b2e373b76a0 Apr 20 17:48:04.598875 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:04.598834 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a64b0a7_defc_4ef8_b833_3e4b069784b3.slice/crio-64d35a31bffc7f2cc061acae846fe9a0a4613bc92ac44de018f8e27edd5401e5 WatchSource:0}: Error finding container 64d35a31bffc7f2cc061acae846fe9a0a4613bc92ac44de018f8e27edd5401e5: Status 404 returned error can't find the container with id 64d35a31bffc7f2cc061acae846fe9a0a4613bc92ac44de018f8e27edd5401e5 Apr 20 17:48:04.600403 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:04.600272 2581 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod502aa551_1a31_4b43_b111_f4090d9c5028.slice/crio-da9011995f334e61b986aa964882296d9723895c9d1e54964471a359f794abcc WatchSource:0}: Error finding container da9011995f334e61b986aa964882296d9723895c9d1e54964471a359f794abcc: Status 404 returned error can't find the container with id da9011995f334e61b986aa964882296d9723895c9d1e54964471a359f794abcc Apr 20 17:48:04.600985 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:04.600957 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70ebccba_9caf_4e18_b7c2_430622fd3b07.slice/crio-7caf88b2574190941b6bab24db877a1bf4f12c74d5085a8364446acfc1c18b3d WatchSource:0}: Error finding container 7caf88b2574190941b6bab24db877a1bf4f12c74d5085a8364446acfc1c18b3d: Status 404 returned error can't find the container with id 7caf88b2574190941b6bab24db877a1bf4f12c74d5085a8364446acfc1c18b3d Apr 20 17:48:04.601917 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:04.601894 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6667bb3e_c213_4fb7_a2f0_bb9a65372bf3.slice/crio-47df9a997bb74c16dd55544dbcbe61d7d70ca8be92bef4cca1cc3c3ea08c2f86 WatchSource:0}: Error finding container 47df9a997bb74c16dd55544dbcbe61d7d70ca8be92bef4cca1cc3c3ea08c2f86: Status 404 returned error can't find the container with id 47df9a997bb74c16dd55544dbcbe61d7d70ca8be92bef4cca1cc3c3ea08c2f86 Apr 20 17:48:04.602448 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:04.602425 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1c5a7fc_92d7_49a4_85c8_128fe8e46b19.slice/crio-4a96fd267e02b4c60626e32d9fb260ea9cb51d7ab123f37254bf2191dc621754 WatchSource:0}: Error finding container 4a96fd267e02b4c60626e32d9fb260ea9cb51d7ab123f37254bf2191dc621754: Status 404 returned error can't find the container with id 4a96fd267e02b4c60626e32d9fb260ea9cb51d7ab123f37254bf2191dc621754 Apr 20 17:48:04.673418 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:04.673250 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 17:43:02 +0000 UTC" deadline="2027-11-22 22:19:22.586407166 +0000 UTC" Apr 20 17:48:04.673418 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:04.673411 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13948h31m17.912998793s" Apr 20 17:48:04.761718 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:04.761678 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:04.761901 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:04.761870 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
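Among the watch-event warnings above (a benign race: cAdvisor sees a new crio-* cgroup before the container is registered with the runtime), the kubelet also logs its serving-certificate rotation plan: the certificate expires 2028-04-19 17:43:02 UTC, the manager picks a jittered rotation deadline of 2027-11-22 22:19:22 UTC (roughly 80% of the way through the validity window), and sleeps until then. The quoted sleep of 13948h31m17.9s is simply deadline minus now, as this stdlib check reproduces:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps taken from the log entries above.
	now, _ := time.Parse(time.RFC3339Nano, "2026-04-20T17:48:04.673411Z")
	deadline, _ := time.Parse(time.RFC3339Nano, "2027-11-22T22:19:22.586407166Z")

	// The "Waiting for next certificate rotation" sleep is deadline minus now.
	fmt.Println(deadline.Sub(now)) // ≈ 13948h31m17.9s, matching sleep= in the log
}
```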
pod="openshift-multus/network-metrics-daemon-7gff7" podUID="c4102ca4-2dfd-487f-85a4-c91b3ae6797e" Apr 20 17:48:04.769519 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:04.769473 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb" event={"ID":"502aa551-1a31-4b43-b111-f4090d9c5028","Type":"ContainerStarted","Data":"da9011995f334e61b986aa964882296d9723895c9d1e54964471a359f794abcc"} Apr 20 17:48:04.770504 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:04.770480 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-g8b7h" event={"ID":"1a64b0a7-defc-4ef8-b833-3e4b069784b3","Type":"ContainerStarted","Data":"64d35a31bffc7f2cc061acae846fe9a0a4613bc92ac44de018f8e27edd5401e5"} Apr 20 17:48:04.771454 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:04.771430 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-r8qd7" event={"ID":"d36b98c1-fc6f-4438-84d5-25382aad1dc6","Type":"ContainerStarted","Data":"a7ec078f5699c0e11e20c5f5df51172555981a13fb2f1aa830bc8b2e373b76a0"} Apr 20 17:48:04.772923 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:04.772894 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-9.ec2.internal" event={"ID":"67dd5e3ce0ce81437a1036d614e3ee5e","Type":"ContainerStarted","Data":"226d87cc3ee61fa1b34df23fd6b3ab4144282aca6214ef4e88c8f239b90af110"} Apr 20 17:48:04.773946 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:04.773928 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-v8gth" event={"ID":"70ebccba-9caf-4e18-b7c2-430622fd3b07","Type":"ContainerStarted","Data":"7caf88b2574190941b6bab24db877a1bf4f12c74d5085a8364446acfc1c18b3d"} Apr 20 17:48:04.774910 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:04.774886 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jcj94" event={"ID":"4463b9d9-8fcb-46f7-aa92-63bc974f8de4","Type":"ContainerStarted","Data":"b5a3f17bcd68ccda4972182e59598f7b7199465ac30b86c186e70705a0d9752b"} Apr 20 17:48:04.775797 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:04.775774 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" event={"ID":"a41b8e52-ae34-439f-84de-ee703e85e441","Type":"ContainerStarted","Data":"ba694f68500b7e8e26a61f8da08b1fef3c74e5bcd214e4e7b410243594d2ac8d"} Apr 20 17:48:04.778106 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:04.778084 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" event={"ID":"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19","Type":"ContainerStarted","Data":"646db08ed793a917d0910224396625787b0dd8d90ebd4752d6d02ec43cf81f54"} Apr 20 17:48:04.779563 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:04.779541 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-n9gmk" event={"ID":"a1c5a7fc-92d7-49a4-85c8-128fe8e46b19","Type":"ContainerStarted","Data":"4a96fd267e02b4c60626e32d9fb260ea9cb51d7ab123f37254bf2191dc621754"} Apr 20 17:48:04.780398 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:04.780380 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9xc6w" event={"ID":"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3","Type":"ContainerStarted","Data":"47df9a997bb74c16dd55544dbcbe61d7d70ca8be92bef4cca1cc3c3ea08c2f86"} Apr 20 17:48:04.787585 ip-10-0-138-9 kubenswrapper[2581]: I0420 
17:48:04.787542 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-9.ec2.internal" podStartSLOduration=2.787531372 podStartE2EDuration="2.787531372s" podCreationTimestamp="2026-04-20 17:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 17:48:04.787247763 +0000 UTC m=+3.562778558" watchObservedRunningTime="2026-04-20 17:48:04.787531372 +0000 UTC m=+3.563062166" Apr 20 17:48:05.257671 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:05.257536 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs\") pod \"network-metrics-daemon-7gff7\" (UID: \"c4102ca4-2dfd-487f-85a4-c91b3ae6797e\") " pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:05.257889 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:05.257792 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:05.257889 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:05.257863 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs podName:c4102ca4-2dfd-487f-85a4-c91b3ae6797e nodeName:}" failed. No retries permitted until 2026-04-20 17:48:07.257844866 +0000 UTC m=+6.033375641 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs") pod "network-metrics-daemon-7gff7" (UID: "c4102ca4-2dfd-487f-85a4-c91b3ae6797e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:05.359054 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:05.358649 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlnrn\" (UniqueName: \"kubernetes.io/projected/1496881d-daad-4889-957b-ae0b63332278-kube-api-access-tlnrn\") pod \"network-check-target-5228n\" (UID: \"1496881d-daad-4889-957b-ae0b63332278\") " pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:48:05.359054 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:05.358849 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 17:48:05.359054 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:05.358868 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 17:48:05.359054 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:05.358881 2581 projected.go:194] Error preparing data for projected volume kube-api-access-tlnrn for pod openshift-network-diagnostics/network-check-target-5228n: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:05.359054 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:05.358939 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1496881d-daad-4889-957b-ae0b63332278-kube-api-access-tlnrn podName:1496881d-daad-4889-957b-ae0b63332278 nodeName:}" failed. 
No retries permitted until 2026-04-20 17:48:07.358920656 +0000 UTC m=+6.134451446 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-tlnrn" (UniqueName: "kubernetes.io/projected/1496881d-daad-4889-957b-ae0b63332278-kube-api-access-tlnrn") pod "network-check-target-5228n" (UID: "1496881d-daad-4889-957b-ae0b63332278") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:05.763686 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:05.763651 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:48:05.764121 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:05.763779 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5228n" podUID="1496881d-daad-4889-957b-ae0b63332278" Apr 20 17:48:05.799862 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:05.799775 2581 generic.go:358] "Generic (PLEG): container finished" podID="d5a0805b170e56dce7c8063947d7841f" containerID="9c9f978ddccc6f1db7a7d222c4d66cd5b79baed32a2cc738822804dec2d71871" exitCode=0 Apr 20 17:48:05.800715 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:05.800679 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-9.ec2.internal" event={"ID":"d5a0805b170e56dce7c8063947d7841f","Type":"ContainerDied","Data":"9c9f978ddccc6f1db7a7d222c4d66cd5b79baed32a2cc738822804dec2d71871"} Apr 20 17:48:06.761826 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:06.761791 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:06.762002 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:06.761950 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7gff7" podUID="c4102ca4-2dfd-487f-85a4-c91b3ae6797e" Apr 20 17:48:06.814467 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:06.813614 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-9.ec2.internal" event={"ID":"d5a0805b170e56dce7c8063947d7841f","Type":"ContainerStarted","Data":"66afa4e75390a08a063610f1504c2cb52060bedf38db069bcf1c6c3582dc7cb0"} Apr 20 17:48:06.827158 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:06.827102 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-9.ec2.internal" podStartSLOduration=4.82708237 podStartE2EDuration="4.82708237s" podCreationTimestamp="2026-04-20 17:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 17:48:06.826921896 +0000 UTC m=+5.602452691" watchObservedRunningTime="2026-04-20 17:48:06.82708237 +0000 UTC m=+5.602613166" Apr 20 17:48:07.276230 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:07.276174 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs\") pod \"network-metrics-daemon-7gff7\" (UID: \"c4102ca4-2dfd-487f-85a4-c91b3ae6797e\") " pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:07.276421 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:07.276342 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:07.276421 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:07.276407 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs podName:c4102ca4-2dfd-487f-85a4-c91b3ae6797e nodeName:}" failed. No retries permitted until 2026-04-20 17:48:11.276390196 +0000 UTC m=+10.051920990 (durationBeforeRetry 4s). 
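The "Observed pod startup duration" entries are straightforward to verify. For static pods that pull no images (firstStartedPulling is the zero time), podStartSLOduration equals podStartE2EDuration, which is watchObservedRunningTime minus podCreationTimestamp (the latter truncated to whole seconds). Checking the kube-rbac-proxy-crio entry above:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Values from the "Observed pod startup duration" entry above.
	created, _ := time.Parse(time.RFC3339, "2026-04-20T17:48:02Z")
	observed, _ := time.Parse(time.RFC3339Nano, "2026-04-20T17:48:06.82708237Z")

	// With no image pulls, the SLO duration is just observed-running minus creation.
	fmt.Println(observed.Sub(created)) // 4.82708237s
}
```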
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs") pod "network-metrics-daemon-7gff7" (UID: "c4102ca4-2dfd-487f-85a4-c91b3ae6797e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:07.378124 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:07.377404 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlnrn\" (UniqueName: \"kubernetes.io/projected/1496881d-daad-4889-957b-ae0b63332278-kube-api-access-tlnrn\") pod \"network-check-target-5228n\" (UID: \"1496881d-daad-4889-957b-ae0b63332278\") " pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:48:07.378124 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:07.377650 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 17:48:07.378124 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:07.377671 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 17:48:07.378124 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:07.377685 2581 projected.go:194] Error preparing data for projected volume kube-api-access-tlnrn for pod openshift-network-diagnostics/network-check-target-5228n: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:07.378124 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:07.377745 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1496881d-daad-4889-957b-ae0b63332278-kube-api-access-tlnrn podName:1496881d-daad-4889-957b-ae0b63332278 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:11.377725879 +0000 UTC m=+10.153256665 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-tlnrn" (UniqueName: "kubernetes.io/projected/1496881d-daad-4889-957b-ae0b63332278-kube-api-access-tlnrn") pod "network-check-target-5228n" (UID: "1496881d-daad-4889-957b-ae0b63332278") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:07.761864 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:07.761793 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:48:07.762005 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:07.761953 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5228n" podUID="1496881d-daad-4889-957b-ae0b63332278" Apr 20 17:48:08.761786 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:08.761254 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:08.761786 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:08.761403 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gff7" podUID="c4102ca4-2dfd-487f-85a4-c91b3ae6797e" Apr 20 17:48:09.762311 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:09.761932 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:48:09.762311 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:09.762070 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5228n" podUID="1496881d-daad-4889-957b-ae0b63332278" Apr 20 17:48:10.761294 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:10.761257 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:10.761499 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:10.761408 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gff7" podUID="c4102ca4-2dfd-487f-85a4-c91b3ae6797e" Apr 20 17:48:11.312847 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:11.312258 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs\") pod \"network-metrics-daemon-7gff7\" (UID: \"c4102ca4-2dfd-487f-85a4-c91b3ae6797e\") " pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:11.312847 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:11.312421 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:11.312847 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:11.312485 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs podName:c4102ca4-2dfd-487f-85a4-c91b3ae6797e nodeName:}" failed. No retries permitted until 2026-04-20 17:48:19.312465882 +0000 UTC m=+18.087996671 (durationBeforeRetry 8s). 
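The durationBeforeRetry values for the metrics-certs and kube-api-access-tlnrn mounts trace a doubling series: 1s, 2s, 4s, 8s here, 16s below, while original-pull-secret later starts at 500ms. This is the kubelet's per-operation exponential backoff on repeated mount failure; a stdlib sketch of the policy (the cap chosen here is an assumption, since this excerpt never reaches it):

```go
package main

import (
	"fmt"
	"time"
)

// nextDelay doubles the previous retry delay, starting at 500ms and clamping
// at a ceiling, mirroring the durationBeforeRetry values in the log above.
func nextDelay(prev, ceiling time.Duration) time.Duration {
	if prev <= 0 {
		return 500 * time.Millisecond
	}
	if d := prev * 2; d < ceiling {
		return d
	}
	return ceiling
}

func main() {
	var d time.Duration
	for i := 0; i < 7; i++ {
		d = nextDelay(d, 2*time.Minute) // ceiling is illustrative only
		fmt.Println(d)                  // 500ms 1s 2s 4s 8s 16s 32s
	}
}
```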
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs") pod "network-metrics-daemon-7gff7" (UID: "c4102ca4-2dfd-487f-85a4-c91b3ae6797e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:11.413339 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:11.413299 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlnrn\" (UniqueName: \"kubernetes.io/projected/1496881d-daad-4889-957b-ae0b63332278-kube-api-access-tlnrn\") pod \"network-check-target-5228n\" (UID: \"1496881d-daad-4889-957b-ae0b63332278\") " pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:48:11.413530 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:11.413460 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 17:48:11.413530 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:11.413487 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 17:48:11.413530 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:11.413502 2581 projected.go:194] Error preparing data for projected volume kube-api-access-tlnrn for pod openshift-network-diagnostics/network-check-target-5228n: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:11.413730 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:11.413591 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1496881d-daad-4889-957b-ae0b63332278-kube-api-access-tlnrn podName:1496881d-daad-4889-957b-ae0b63332278 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:19.413570807 +0000 UTC m=+18.189101609 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-tlnrn" (UniqueName: "kubernetes.io/projected/1496881d-daad-4889-957b-ae0b63332278-kube-api-access-tlnrn") pod "network-check-target-5228n" (UID: "1496881d-daad-4889-957b-ae0b63332278") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:11.762842 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:11.762752 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:48:11.762992 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:11.762874 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5228n" podUID="1496881d-daad-4889-957b-ae0b63332278" Apr 20 17:48:12.761917 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:12.761881 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:12.762365 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:12.762024 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gff7" podUID="c4102ca4-2dfd-487f-85a4-c91b3ae6797e" Apr 20 17:48:13.762013 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:13.761477 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:48:13.762013 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:13.761617 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5228n" podUID="1496881d-daad-4889-957b-ae0b63332278" Apr 20 17:48:14.761260 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:14.761225 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:14.761547 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:14.761360 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gff7" podUID="c4102ca4-2dfd-487f-85a4-c91b3ae6797e" Apr 20 17:48:15.761491 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:15.761455 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:48:15.761980 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:15.761586 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5228n" podUID="1496881d-daad-4889-957b-ae0b63332278" Apr 20 17:48:16.761234 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:16.761196 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:16.761434 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:16.761345 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gff7" podUID="c4102ca4-2dfd-487f-85a4-c91b3ae6797e" Apr 20 17:48:17.761292 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:17.761258 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:48:17.761769 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:17.761384 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5228n" podUID="1496881d-daad-4889-957b-ae0b63332278" Apr 20 17:48:18.761993 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:18.761955 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:18.762431 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:18.762069 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gff7" podUID="c4102ca4-2dfd-487f-85a4-c91b3ae6797e" Apr 20 17:48:18.922871 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:18.922827 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-xrxtp"] Apr 20 17:48:18.933678 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:18.933642 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xrxtp" Apr 20 17:48:18.933843 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:18.933728 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
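Every "network is not ready" error in this stretch shares one root cause: the container runtime reports NetworkReady=false because /etc/kubernetes/cni/net.d/ contains no CNI configuration yet. ovnkube-node-lbls6, started above, writes one once it is up, after which the non-host-network pods stop being skipped. Roughly, the readiness check amounts to a directory scan like the following (the matched extensions are the conventional CNI ones, assumed here):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// hasCNIConfig reports whether dir contains any CNI network config,
// approximating the NetworkReady check referenced in the errors above.
func hasCNIConfig(dir string) bool {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false
	}
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json": // conventional CNI config extensions
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(hasCNIConfig("/etc/kubernetes/cni/net.d"))
}
```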
pod="kube-system/global-pull-secret-syncer-xrxtp" podUID="9041de32-fafa-4935-a258-8c6ecce98d75" Apr 20 17:48:19.075884 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:19.075791 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9041de32-fafa-4935-a258-8c6ecce98d75-dbus\") pod \"global-pull-secret-syncer-xrxtp\" (UID: \"9041de32-fafa-4935-a258-8c6ecce98d75\") " pod="kube-system/global-pull-secret-syncer-xrxtp" Apr 20 17:48:19.075884 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:19.075860 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9041de32-fafa-4935-a258-8c6ecce98d75-kubelet-config\") pod \"global-pull-secret-syncer-xrxtp\" (UID: \"9041de32-fafa-4935-a258-8c6ecce98d75\") " pod="kube-system/global-pull-secret-syncer-xrxtp" Apr 20 17:48:19.076133 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:19.075897 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9041de32-fafa-4935-a258-8c6ecce98d75-original-pull-secret\") pod \"global-pull-secret-syncer-xrxtp\" (UID: \"9041de32-fafa-4935-a258-8c6ecce98d75\") " pod="kube-system/global-pull-secret-syncer-xrxtp" Apr 20 17:48:19.176243 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:19.176207 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9041de32-fafa-4935-a258-8c6ecce98d75-dbus\") pod \"global-pull-secret-syncer-xrxtp\" (UID: \"9041de32-fafa-4935-a258-8c6ecce98d75\") " pod="kube-system/global-pull-secret-syncer-xrxtp" Apr 20 17:48:19.176401 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:19.176274 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9041de32-fafa-4935-a258-8c6ecce98d75-kubelet-config\") pod \"global-pull-secret-syncer-xrxtp\" (UID: \"9041de32-fafa-4935-a258-8c6ecce98d75\") " pod="kube-system/global-pull-secret-syncer-xrxtp" Apr 20 17:48:19.176401 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:19.176317 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9041de32-fafa-4935-a258-8c6ecce98d75-original-pull-secret\") pod \"global-pull-secret-syncer-xrxtp\" (UID: \"9041de32-fafa-4935-a258-8c6ecce98d75\") " pod="kube-system/global-pull-secret-syncer-xrxtp" Apr 20 17:48:19.176401 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:19.176371 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9041de32-fafa-4935-a258-8c6ecce98d75-kubelet-config\") pod \"global-pull-secret-syncer-xrxtp\" (UID: \"9041de32-fafa-4935-a258-8c6ecce98d75\") " pod="kube-system/global-pull-secret-syncer-xrxtp" Apr 20 17:48:19.176595 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:19.176430 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9041de32-fafa-4935-a258-8c6ecce98d75-dbus\") pod \"global-pull-secret-syncer-xrxtp\" (UID: \"9041de32-fafa-4935-a258-8c6ecce98d75\") " pod="kube-system/global-pull-secret-syncer-xrxtp" Apr 20 17:48:19.176595 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:19.176461 2581 secret.go:189] Couldn't get secret 
kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 17:48:19.176595 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:19.176550 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9041de32-fafa-4935-a258-8c6ecce98d75-original-pull-secret podName:9041de32-fafa-4935-a258-8c6ecce98d75 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:19.676527245 +0000 UTC m=+18.452058023 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9041de32-fafa-4935-a258-8c6ecce98d75-original-pull-secret") pod "global-pull-secret-syncer-xrxtp" (UID: "9041de32-fafa-4935-a258-8c6ecce98d75") : object "kube-system"/"original-pull-secret" not registered Apr 20 17:48:19.378159 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:19.378059 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs\") pod \"network-metrics-daemon-7gff7\" (UID: \"c4102ca4-2dfd-487f-85a4-c91b3ae6797e\") " pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:19.378337 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:19.378256 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:19.378337 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:19.378333 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs podName:c4102ca4-2dfd-487f-85a4-c91b3ae6797e nodeName:}" failed. No retries permitted until 2026-04-20 17:48:35.378312484 +0000 UTC m=+34.153843260 (durationBeforeRetry 16s). 
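When triaging a boot like this, it is useful to extract the retry series from the journal and confirm the mounts are still backing off normally rather than parked at a ceiling. A small extraction sketch over lines in this format:

```go
package main

import (
	"fmt"
	"regexp"
	"time"
)

// Matches the retry interval the kubelet prints on each failed mount attempt.
var retryRE = regexp.MustCompile(`durationBeforeRetry (\d+(?:\.\d+)?(?:ms|s|m))`)

func main() {
	// Sample lines abbreviated from the journal above.
	lines := []string{
		"... failed. No retries permitted until ... (durationBeforeRetry 500ms). Error: ...",
		"... failed. No retries permitted until ... (durationBeforeRetry 16s). Error: ...",
	}
	for _, l := range lines {
		if m := retryRE.FindStringSubmatch(l); m != nil {
			if d, err := time.ParseDuration(m[1]); err == nil {
				fmt.Println(d) // 500ms, then 16s
			}
		}
	}
}
```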
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs") pod "network-metrics-daemon-7gff7" (UID: "c4102ca4-2dfd-487f-85a4-c91b3ae6797e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:19.479377 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:19.479336 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlnrn\" (UniqueName: \"kubernetes.io/projected/1496881d-daad-4889-957b-ae0b63332278-kube-api-access-tlnrn\") pod \"network-check-target-5228n\" (UID: \"1496881d-daad-4889-957b-ae0b63332278\") " pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:48:19.479542 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:19.479502 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 17:48:19.479542 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:19.479524 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 17:48:19.479542 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:19.479536 2581 projected.go:194] Error preparing data for projected volume kube-api-access-tlnrn for pod openshift-network-diagnostics/network-check-target-5228n: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:19.479722 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:19.479589 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1496881d-daad-4889-957b-ae0b63332278-kube-api-access-tlnrn podName:1496881d-daad-4889-957b-ae0b63332278 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:35.479572737 +0000 UTC m=+34.255103521 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-tlnrn" (UniqueName: "kubernetes.io/projected/1496881d-daad-4889-957b-ae0b63332278-kube-api-access-tlnrn") pod "network-check-target-5228n" (UID: "1496881d-daad-4889-957b-ae0b63332278") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:19.680950 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:19.680898 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9041de32-fafa-4935-a258-8c6ecce98d75-original-pull-secret\") pod \"global-pull-secret-syncer-xrxtp\" (UID: \"9041de32-fafa-4935-a258-8c6ecce98d75\") " pod="kube-system/global-pull-secret-syncer-xrxtp" Apr 20 17:48:19.681136 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:19.681087 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 17:48:19.681191 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:19.681175 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9041de32-fafa-4935-a258-8c6ecce98d75-original-pull-secret podName:9041de32-fafa-4935-a258-8c6ecce98d75 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:20.681153425 +0000 UTC m=+19.456684198 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9041de32-fafa-4935-a258-8c6ecce98d75-original-pull-secret") pod "global-pull-secret-syncer-xrxtp" (UID: "9041de32-fafa-4935-a258-8c6ecce98d75") : object "kube-system"/"original-pull-secret" not registered Apr 20 17:48:19.761477 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:19.761446 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:48:19.761659 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:19.761585 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5228n" podUID="1496881d-daad-4889-957b-ae0b63332278" Apr 20 17:48:20.687325 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:20.687293 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9041de32-fafa-4935-a258-8c6ecce98d75-original-pull-secret\") pod \"global-pull-secret-syncer-xrxtp\" (UID: \"9041de32-fafa-4935-a258-8c6ecce98d75\") " pod="kube-system/global-pull-secret-syncer-xrxtp" Apr 20 17:48:20.687747 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:20.687446 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 17:48:20.687747 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:20.687518 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9041de32-fafa-4935-a258-8c6ecce98d75-original-pull-secret podName:9041de32-fafa-4935-a258-8c6ecce98d75 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:22.687503191 +0000 UTC m=+21.463033963 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9041de32-fafa-4935-a258-8c6ecce98d75-original-pull-secret") pod "global-pull-secret-syncer-xrxtp" (UID: "9041de32-fafa-4935-a258-8c6ecce98d75") : object "kube-system"/"original-pull-secret" not registered Apr 20 17:48:20.761676 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:20.761639 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xrxtp" Apr 20 17:48:20.761862 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:20.761682 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:20.761862 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:20.761772 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-xrxtp" podUID="9041de32-fafa-4935-a258-8c6ecce98d75" Apr 20 17:48:20.761982 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:20.761925 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gff7" podUID="c4102ca4-2dfd-487f-85a4-c91b3ae6797e" Apr 20 17:48:21.762241 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:21.762204 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:48:21.762713 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:21.762329 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5228n" podUID="1496881d-daad-4889-957b-ae0b63332278" Apr 20 17:48:22.704254 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:22.703904 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9041de32-fafa-4935-a258-8c6ecce98d75-original-pull-secret\") pod \"global-pull-secret-syncer-xrxtp\" (UID: \"9041de32-fafa-4935-a258-8c6ecce98d75\") " pod="kube-system/global-pull-secret-syncer-xrxtp" Apr 20 17:48:22.704412 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:22.704053 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 17:48:22.704412 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:22.704324 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9041de32-fafa-4935-a258-8c6ecce98d75-original-pull-secret podName:9041de32-fafa-4935-a258-8c6ecce98d75 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:26.704308647 +0000 UTC m=+25.479839420 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9041de32-fafa-4935-a258-8c6ecce98d75-original-pull-secret") pod "global-pull-secret-syncer-xrxtp" (UID: "9041de32-fafa-4935-a258-8c6ecce98d75") : object "kube-system"/"original-pull-secret" not registered Apr 20 17:48:22.761448 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:22.761416 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xrxtp" Apr 20 17:48:22.761603 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:22.761420 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:22.761603 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:22.761571 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-xrxtp" podUID="9041de32-fafa-4935-a258-8c6ecce98d75" Apr 20 17:48:22.761730 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:22.761617 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gff7" podUID="c4102ca4-2dfd-487f-85a4-c91b3ae6797e" Apr 20 17:48:22.843929 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:22.843887 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-v8gth" event={"ID":"70ebccba-9caf-4e18-b7c2-430622fd3b07","Type":"ContainerStarted","Data":"b45aebce9dc1a5cf4dbc2211446b29f55eecbe0691d17cd1ca3c1b6b0580dc0e"} Apr 20 17:48:22.846905 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:22.846841 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" event={"ID":"a41b8e52-ae34-439f-84de-ee703e85e441","Type":"ContainerStarted","Data":"7e887d9362c70fadd76688997e2f0316cc564b6c8d669d00754eb463361a3261"} Apr 20 17:48:22.846905 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:22.846876 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" event={"ID":"a41b8e52-ae34-439f-84de-ee703e85e441","Type":"ContainerStarted","Data":"dc9f9510748ef633316647e5f19e241358949171aafed5c132eae39d44e2f87a"} Apr 20 17:48:22.846905 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:22.846890 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" event={"ID":"a41b8e52-ae34-439f-84de-ee703e85e441","Type":"ContainerStarted","Data":"58e8c572f17054c7eafd478386bbb679742da4252270ed8dbfc382a4c7710fef"} Apr 20 17:48:22.846905 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:22.846903 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" event={"ID":"a41b8e52-ae34-439f-84de-ee703e85e441","Type":"ContainerStarted","Data":"23b35efb3b2707d2350e1f3cc9c416f1a54d601d9e49cf3a0a061ec40013ef62"} Apr 20 17:48:22.847211 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:22.846914 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" event={"ID":"a41b8e52-ae34-439f-84de-ee703e85e441","Type":"ContainerStarted","Data":"d1ca9506b9f42836d5ce66e590866994bbca7b732c235502b9c6458a7abc5af9"} Apr 20 17:48:22.847211 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:22.846927 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" event={"ID":"a41b8e52-ae34-439f-84de-ee703e85e441","Type":"ContainerStarted","Data":"9cf6df0ccee4311b68399edff44d11214a6e8dc8428989a1c2c62cd68411699a"} Apr 20 17:48:22.848320 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:22.848285 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" event={"ID":"3fe8f78d-28c5-41f6-963a-4f1cd1a7cf19","Type":"ContainerStarted","Data":"e07f60dc9f65740306c01b7e80ee7343d3df7af55f71745de7eedc8823199f01"} Apr 20 17:48:22.849749 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:22.849715 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-n9gmk" 
event={"ID":"a1c5a7fc-92d7-49a4-85c8-128fe8e46b19","Type":"ContainerStarted","Data":"a14b9eee66d7068dc9b9166621a6bb02ddf03c8cc05a86eeae3745fd35353eef"} Apr 20 17:48:22.851448 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:22.851423 2581 generic.go:358] "Generic (PLEG): container finished" podID="6667bb3e-c213-4fb7-a2f0-bb9a65372bf3" containerID="6811baa87d1ac9534ce508f7bcb22511632c3b1a6a036a835bb8c2bcc85947aa" exitCode=0 Apr 20 17:48:22.851552 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:22.851508 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9xc6w" event={"ID":"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3","Type":"ContainerDied","Data":"6811baa87d1ac9534ce508f7bcb22511632c3b1a6a036a835bb8c2bcc85947aa"} Apr 20 17:48:22.852987 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:22.852969 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb" event={"ID":"502aa551-1a31-4b43-b111-f4090d9c5028","Type":"ContainerStarted","Data":"d39831a2c869a8e0ed8dc04fc9875613888ab44d81a71b950c324b20f0315de1"} Apr 20 17:48:22.854527 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:22.854442 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-g8b7h" event={"ID":"1a64b0a7-defc-4ef8-b833-3e4b069784b3","Type":"ContainerStarted","Data":"9f60f097c43ffab8a264f0fa72bc5a2ed728398bebf656325fde16d38297b8a7"} Apr 20 17:48:22.856033 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:22.856013 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-r8qd7" event={"ID":"d36b98c1-fc6f-4438-84d5-25382aad1dc6","Type":"ContainerStarted","Data":"d75d3e9d30abadd4beaf211336c017d0293d56f90764076592a6cc1156850beb"} Apr 20 17:48:22.860795 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:22.860751 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-v8gth" podStartSLOduration=4.609123502 podStartE2EDuration="21.860740969s" podCreationTimestamp="2026-04-20 17:48:01 +0000 UTC" firstStartedPulling="2026-04-20 17:48:04.60571241 +0000 UTC m=+3.381243183" lastFinishedPulling="2026-04-20 17:48:21.857329874 +0000 UTC m=+20.632860650" observedRunningTime="2026-04-20 17:48:22.860533854 +0000 UTC m=+21.636064650" watchObservedRunningTime="2026-04-20 17:48:22.860740969 +0000 UTC m=+21.636271773" Apr 20 17:48:22.873535 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:22.873497 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-7hcs9" podStartSLOduration=4.590847991 podStartE2EDuration="21.873485487s" podCreationTimestamp="2026-04-20 17:48:01 +0000 UTC" firstStartedPulling="2026-04-20 17:48:04.574704304 +0000 UTC m=+3.350235077" lastFinishedPulling="2026-04-20 17:48:21.8573418 +0000 UTC m=+20.632872573" observedRunningTime="2026-04-20 17:48:22.872877594 +0000 UTC m=+21.648408389" watchObservedRunningTime="2026-04-20 17:48:22.873485487 +0000 UTC m=+21.649016281" Apr 20 17:48:22.925322 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:22.925262 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-g8b7h" podStartSLOduration=11.852368797 podStartE2EDuration="20.925246529s" podCreationTimestamp="2026-04-20 17:48:02 +0000 UTC" firstStartedPulling="2026-04-20 17:48:04.600536081 +0000 UTC m=+3.376066857" lastFinishedPulling="2026-04-20 17:48:13.673413807 +0000 UTC m=+12.448944589" 
observedRunningTime="2026-04-20 17:48:22.905896403 +0000 UTC m=+21.681427196" watchObservedRunningTime="2026-04-20 17:48:22.925246529 +0000 UTC m=+21.700777324" Apr 20 17:48:22.925606 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:22.925582 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-r8qd7" podStartSLOduration=3.706628125 podStartE2EDuration="20.925575073s" podCreationTimestamp="2026-04-20 17:48:02 +0000 UTC" firstStartedPulling="2026-04-20 17:48:04.599326654 +0000 UTC m=+3.374857446" lastFinishedPulling="2026-04-20 17:48:21.81827362 +0000 UTC m=+20.593804394" observedRunningTime="2026-04-20 17:48:22.925002847 +0000 UTC m=+21.700533641" watchObservedRunningTime="2026-04-20 17:48:22.925575073 +0000 UTC m=+21.701106213" Apr 20 17:48:22.943375 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:22.943331 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-n9gmk" podStartSLOduration=3.694942669 podStartE2EDuration="20.943318346s" podCreationTimestamp="2026-04-20 17:48:02 +0000 UTC" firstStartedPulling="2026-04-20 17:48:04.605864871 +0000 UTC m=+3.381395659" lastFinishedPulling="2026-04-20 17:48:21.85424056 +0000 UTC m=+20.629771336" observedRunningTime="2026-04-20 17:48:22.942786165 +0000 UTC m=+21.718316959" watchObservedRunningTime="2026-04-20 17:48:22.943318346 +0000 UTC m=+21.718849141" Apr 20 17:48:23.451031 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:23.450901 2581 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 17:48:23.708586 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:23.708400 2581 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T17:48:23.450925014Z","UUID":"e73481e1-36ce-4099-b929-f288ec1d7522","Handler":null,"Name":"","Endpoint":""} Apr 20 17:48:23.711483 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:23.711461 2581 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 17:48:23.711618 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:23.711495 2581 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 17:48:23.761323 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:23.761286 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:48:23.761507 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:23.761423 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5228n" podUID="1496881d-daad-4889-957b-ae0b63332278" Apr 20 17:48:23.855827 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:23.855769 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-r8qd7" Apr 20 17:48:23.860590 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:23.860552 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jcj94" event={"ID":"4463b9d9-8fcb-46f7-aa92-63bc974f8de4","Type":"ContainerStarted","Data":"b97c3b7ce7a58429af9909017a1202dc3b106bdb785ca55bfad8aab943dce040"} Apr 20 17:48:23.864053 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:23.863655 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb" event={"ID":"502aa551-1a31-4b43-b111-f4090d9c5028","Type":"ContainerStarted","Data":"e12b0380a8757ce2cc3d9455b00e7d1afddca70c1606fc3d1bfc8eee33db540b"} Apr 20 17:48:23.876070 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:23.876013 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-jcj94" podStartSLOduration=5.654258694 podStartE2EDuration="22.875995194s" podCreationTimestamp="2026-04-20 17:48:01 +0000 UTC" firstStartedPulling="2026-04-20 17:48:04.596596709 +0000 UTC m=+3.372127482" lastFinishedPulling="2026-04-20 17:48:21.818333206 +0000 UTC m=+20.593863982" observedRunningTime="2026-04-20 17:48:23.874927049 +0000 UTC m=+22.650457845" watchObservedRunningTime="2026-04-20 17:48:23.875995194 +0000 UTC m=+22.651525990" Apr 20 17:48:24.762135 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:24.762044 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xrxtp" Apr 20 17:48:24.762304 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:24.762166 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xrxtp" podUID="9041de32-fafa-4935-a258-8c6ecce98d75" Apr 20 17:48:24.762304 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:24.762221 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:24.762397 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:24.762342 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7gff7" podUID="c4102ca4-2dfd-487f-85a4-c91b3ae6797e" Apr 20 17:48:24.869286 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:24.869245 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" event={"ID":"a41b8e52-ae34-439f-84de-ee703e85e441","Type":"ContainerStarted","Data":"2fe5730afcd68ca8b5712b6cccc348556b99d95dd15fb6e54c01df77be241fc3"} Apr 20 17:48:24.871480 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:24.871435 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb" event={"ID":"502aa551-1a31-4b43-b111-f4090d9c5028","Type":"ContainerStarted","Data":"fe128eae5c42289d46b2dbd7091e3316d172b17f7b169c0ef736a9d0ee57eb0b"} Apr 20 17:48:24.887997 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:24.887933 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gwnrb" podStartSLOduration=4.260937719 podStartE2EDuration="23.887914658s" podCreationTimestamp="2026-04-20 17:48:01 +0000 UTC" firstStartedPulling="2026-04-20 17:48:04.601873556 +0000 UTC m=+3.377404330" lastFinishedPulling="2026-04-20 17:48:24.228850496 +0000 UTC m=+23.004381269" observedRunningTime="2026-04-20 17:48:24.887640654 +0000 UTC m=+23.663171447" watchObservedRunningTime="2026-04-20 17:48:24.887914658 +0000 UTC m=+23.663445458" Apr 20 17:48:25.761989 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:25.761950 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:48:25.762193 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:25.762098 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5228n" podUID="1496881d-daad-4889-957b-ae0b63332278" Apr 20 17:48:26.733305 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:26.733092 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9041de32-fafa-4935-a258-8c6ecce98d75-original-pull-secret\") pod \"global-pull-secret-syncer-xrxtp\" (UID: \"9041de32-fafa-4935-a258-8c6ecce98d75\") " pod="kube-system/global-pull-secret-syncer-xrxtp" Apr 20 17:48:26.733770 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:26.733265 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 17:48:26.733770 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:26.733431 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9041de32-fafa-4935-a258-8c6ecce98d75-original-pull-secret podName:9041de32-fafa-4935-a258-8c6ecce98d75 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:34.733406291 +0000 UTC m=+33.508937067 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9041de32-fafa-4935-a258-8c6ecce98d75-original-pull-secret") pod "global-pull-secret-syncer-xrxtp" (UID: "9041de32-fafa-4935-a258-8c6ecce98d75") : object "kube-system"/"original-pull-secret" not registered Apr 20 17:48:26.754600 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:26.754569 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-r8qd7" Apr 20 17:48:26.755244 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:26.755223 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-r8qd7" Apr 20 17:48:26.761549 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:26.761518 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xrxtp" Apr 20 17:48:26.761971 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:26.761943 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xrxtp" podUID="9041de32-fafa-4935-a258-8c6ecce98d75" Apr 20 17:48:26.762047 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:26.762010 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:26.762135 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:26.762114 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gff7" podUID="c4102ca4-2dfd-487f-85a4-c91b3ae6797e" Apr 20 17:48:26.875195 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:26.875118 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-r8qd7" Apr 20 17:48:27.761237 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:27.761204 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:48:27.762080 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:27.761323 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5228n" podUID="1496881d-daad-4889-957b-ae0b63332278" Apr 20 17:48:27.879197 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:27.879161 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" event={"ID":"a41b8e52-ae34-439f-84de-ee703e85e441","Type":"ContainerStarted","Data":"cbd9527472a2ceace6f54f97c4bc9ddc532b059f6c37e39b09b2a5ec65f59b9c"} Apr 20 17:48:27.879556 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:27.879525 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:27.879724 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:27.879564 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:27.881301 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:27.881276 2581 generic.go:358] "Generic (PLEG): container finished" podID="6667bb3e-c213-4fb7-a2f0-bb9a65372bf3" containerID="f2a5677747773c5fdba9e197126cb8d0a879f26972b50b9a22a41c7ba4a5a1c6" exitCode=0 Apr 20 17:48:27.881420 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:27.881304 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9xc6w" event={"ID":"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3","Type":"ContainerDied","Data":"f2a5677747773c5fdba9e197126cb8d0a879f26972b50b9a22a41c7ba4a5a1c6"} Apr 20 17:48:27.894388 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:27.894360 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:27.894880 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:27.894863 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:27.907351 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:27.907315 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" podStartSLOduration=8.59264274 podStartE2EDuration="25.907302743s" podCreationTimestamp="2026-04-20 17:48:02 +0000 UTC" firstStartedPulling="2026-04-20 17:48:04.596578123 +0000 UTC m=+3.372108896" lastFinishedPulling="2026-04-20 17:48:21.911238111 +0000 UTC m=+20.686768899" observedRunningTime="2026-04-20 17:48:27.90567073 +0000 UTC m=+26.681201526" watchObservedRunningTime="2026-04-20 17:48:27.907302743 +0000 UTC m=+26.682833536" Apr 20 17:48:28.636108 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:28.636072 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:28.762263 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:28.762235 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xrxtp" Apr 20 17:48:28.762635 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:28.762235 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:28.762635 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:28.762337 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-xrxtp" podUID="9041de32-fafa-4935-a258-8c6ecce98d75" Apr 20 17:48:28.762635 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:28.762404 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gff7" podUID="c4102ca4-2dfd-487f-85a4-c91b3ae6797e" Apr 20 17:48:28.896607 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:28.896507 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5228n"] Apr 20 17:48:28.896756 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:28.896645 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:48:28.896756 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:28.896728 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5228n" podUID="1496881d-daad-4889-957b-ae0b63332278" Apr 20 17:48:28.899450 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:28.899421 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xrxtp"] Apr 20 17:48:28.899608 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:28.899514 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xrxtp" Apr 20 17:48:28.899682 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:28.899604 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xrxtp" podUID="9041de32-fafa-4935-a258-8c6ecce98d75" Apr 20 17:48:28.900101 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:28.900078 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7gff7"] Apr 20 17:48:28.900204 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:28.900188 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:28.900335 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:28.900314 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7gff7" podUID="c4102ca4-2dfd-487f-85a4-c91b3ae6797e" Apr 20 17:48:29.887306 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:29.887266 2581 generic.go:358] "Generic (PLEG): container finished" podID="6667bb3e-c213-4fb7-a2f0-bb9a65372bf3" containerID="fdd2bd93ee949571dfa3b9caa0fbf4a2da42b32838d7913367fa89c624cc6b18" exitCode=0 Apr 20 17:48:29.887710 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:29.887348 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9xc6w" event={"ID":"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3","Type":"ContainerDied","Data":"fdd2bd93ee949571dfa3b9caa0fbf4a2da42b32838d7913367fa89c624cc6b18"} Apr 20 17:48:30.762025 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:30.761995 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:30.762161 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:30.761995 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:48:30.762161 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:30.762140 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gff7" podUID="c4102ca4-2dfd-487f-85a4-c91b3ae6797e" Apr 20 17:48:30.762257 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:30.761995 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xrxtp" Apr 20 17:48:30.762257 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:30.762176 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5228n" podUID="1496881d-daad-4889-957b-ae0b63332278" Apr 20 17:48:30.762341 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:30.762291 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xrxtp" podUID="9041de32-fafa-4935-a258-8c6ecce98d75" Apr 20 17:48:31.892936 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:31.892900 2581 generic.go:358] "Generic (PLEG): container finished" podID="6667bb3e-c213-4fb7-a2f0-bb9a65372bf3" containerID="fbad84d9ba039adfcfe555d00b411108b424cc04c02e7cf81d4009baacec3459" exitCode=0 Apr 20 17:48:31.893388 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:31.892967 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9xc6w" event={"ID":"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3","Type":"ContainerDied","Data":"fbad84d9ba039adfcfe555d00b411108b424cc04c02e7cf81d4009baacec3459"} Apr 20 17:48:32.762210 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:32.762169 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:48:32.762210 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:32.762198 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xrxtp" Apr 20 17:48:32.762421 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:32.762297 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5228n" podUID="1496881d-daad-4889-957b-ae0b63332278" Apr 20 17:48:32.762421 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:32.762324 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:32.762511 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:32.762430 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xrxtp" podUID="9041de32-fafa-4935-a258-8c6ecce98d75" Apr 20 17:48:32.762574 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:32.762532 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gff7" podUID="c4102ca4-2dfd-487f-85a4-c91b3ae6797e" Apr 20 17:48:34.762021 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:34.761767 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:34.762482 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:34.761767 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:48:34.762482 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:34.762129 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gff7" podUID="c4102ca4-2dfd-487f-85a4-c91b3ae6797e" Apr 20 17:48:34.762482 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:34.761783 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xrxtp" Apr 20 17:48:34.762482 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:34.762167 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5228n" podUID="1496881d-daad-4889-957b-ae0b63332278" Apr 20 17:48:34.762482 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:34.762286 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xrxtp" podUID="9041de32-fafa-4935-a258-8c6ecce98d75" Apr 20 17:48:34.797332 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:34.797235 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9041de32-fafa-4935-a258-8c6ecce98d75-original-pull-secret\") pod \"global-pull-secret-syncer-xrxtp\" (UID: \"9041de32-fafa-4935-a258-8c6ecce98d75\") " pod="kube-system/global-pull-secret-syncer-xrxtp" Apr 20 17:48:34.797510 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:34.797368 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 17:48:34.797510 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:34.797437 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9041de32-fafa-4935-a258-8c6ecce98d75-original-pull-secret podName:9041de32-fafa-4935-a258-8c6ecce98d75 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:50.79741879 +0000 UTC m=+49.572949564 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9041de32-fafa-4935-a258-8c6ecce98d75-original-pull-secret") pod "global-pull-secret-syncer-xrxtp" (UID: "9041de32-fafa-4935-a258-8c6ecce98d75") : object "kube-system"/"original-pull-secret" not registered Apr 20 17:48:35.086550 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.086423 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-9.ec2.internal" event="NodeReady" Apr 20 17:48:35.086734 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.086594 2581 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 17:48:35.118795 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.118764 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-z92d2"] Apr 20 17:48:35.140683 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.140436 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ddf7b59f4-s2xrb"] Apr 20 17:48:35.140856 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.140725 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-z92d2" Apr 20 17:48:35.143785 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.143437 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 20 17:48:35.143785 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.143518 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-8tm8c\"" Apr 20 17:48:35.143785 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.143440 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 20 17:48:35.152244 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.152085 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-88bb7c64f-pl5kb"] Apr 20 17:48:35.179247 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.179214 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5c47d9bcc6-8sbg6"] Apr 20 17:48:35.179420 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.179355 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ddf7b59f4-s2xrb" Apr 20 17:48:35.179722 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.179695 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-88bb7c64f-pl5kb" Apr 20 17:48:35.182126 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.182100 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 20 17:48:35.182229 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.182153 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 20 17:48:35.182229 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.182170 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 20 17:48:35.182229 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.182170 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 20 17:48:35.182450 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.182295 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 20 17:48:35.182450 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.182355 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-fcwlq\"" Apr 20 17:48:35.203371 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.203344 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d"] Apr 20 17:48:35.203546 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.203526 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:35.206185 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.206162 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 17:48:35.206185 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.206198 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 17:48:35.206413 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.206210 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 17:48:35.206413 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.206213 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-gt955\"" Apr 20 17:48:35.213715 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.213694 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 17:48:35.226901 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.226878 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-z92d2"] Apr 20 17:48:35.226901 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.226905 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ddf7b59f4-s2xrb"] Apr 20 17:48:35.227078 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.226920 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-88bb7c64f-pl5kb"] Apr 20 17:48:35.227078 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.226930 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d"] Apr 20 17:48:35.227078 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.227036 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" Apr 20 17:48:35.227185 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.227039 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5c47d9bcc6-8sbg6"] Apr 20 17:48:35.227235 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.227199 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-p6j2d"] Apr 20 17:48:35.229595 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.229565 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 20 17:48:35.229743 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.229656 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 20 17:48:35.229743 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.229683 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 20 17:48:35.229870 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.229801 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 20 17:48:35.248495 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.248468 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-tqj4h"] Apr 20 17:48:35.248664 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.248649 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p6j2d" Apr 20 17:48:35.251115 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.251094 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 17:48:35.251229 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.251136 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-82cw8\"" Apr 20 17:48:35.251229 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.251136 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 17:48:35.263473 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.263444 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p6j2d"] Apr 20 17:48:35.263473 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.263476 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tqj4h"] Apr 20 17:48:35.263700 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.263584 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tqj4h" Apr 20 17:48:35.266560 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.266527 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gkt2t\"" Apr 20 17:48:35.266923 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.266905 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 17:48:35.266923 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.266920 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 17:48:35.267047 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.266910 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 17:48:35.300924 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.300888 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/488f7f9a-db93-4f84-b131-8e361a2591b8-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-88bb7c64f-pl5kb\" (UID: \"488f7f9a-db93-4f84-b131-8e361a2591b8\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-88bb7c64f-pl5kb" Apr 20 17:48:35.300924 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.300921 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-z92d2\" (UID: \"f82115f5-7c80-4334-9e2c-bf493509b8ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-z92d2" Apr 20 17:48:35.301141 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.300945 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/48b860ba-98dc-4062-9c16-8621c7eed535-image-registry-private-configuration\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:35.301141 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.300961 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:35.301141 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.301013 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48b860ba-98dc-4062-9c16-8621c7eed535-trusted-ca\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:35.301141 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.301048 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/c16e06c2-3dd3-4775-84f5-ec5d0058d281-tmp\") pod \"klusterlet-addon-workmgr-6ddf7b59f4-s2xrb\" (UID: \"c16e06c2-3dd3-4775-84f5-ec5d0058d281\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ddf7b59f4-s2xrb" Apr 20 17:48:35.301141 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.301074 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f82115f5-7c80-4334-9e2c-bf493509b8ca-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-z92d2\" (UID: \"f82115f5-7c80-4334-9e2c-bf493509b8ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-z92d2" Apr 20 17:48:35.301141 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.301111 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-bound-sa-token\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:35.301340 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.301146 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58db5\" (UniqueName: \"kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-kube-api-access-58db5\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:35.301340 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.301165 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v8qr\" (UniqueName: \"kubernetes.io/projected/488f7f9a-db93-4f84-b131-8e361a2591b8-kube-api-access-4v8qr\") pod \"managed-serviceaccount-addon-agent-88bb7c64f-pl5kb\" (UID: \"488f7f9a-db93-4f84-b131-8e361a2591b8\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-88bb7c64f-pl5kb" Apr 20 17:48:35.301340 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.301202 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8l56\" (UniqueName: \"kubernetes.io/projected/c16e06c2-3dd3-4775-84f5-ec5d0058d281-kube-api-access-v8l56\") pod \"klusterlet-addon-workmgr-6ddf7b59f4-s2xrb\" (UID: \"c16e06c2-3dd3-4775-84f5-ec5d0058d281\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ddf7b59f4-s2xrb" Apr 20 17:48:35.301340 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.301236 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/48b860ba-98dc-4062-9c16-8621c7eed535-ca-trust-extracted\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:35.301340 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.301264 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c16e06c2-3dd3-4775-84f5-ec5d0058d281-klusterlet-config\") pod \"klusterlet-addon-workmgr-6ddf7b59f4-s2xrb\" (UID: \"c16e06c2-3dd3-4775-84f5-ec5d0058d281\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ddf7b59f4-s2xrb" 
Apr 20 17:48:35.301340 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.301297 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/48b860ba-98dc-4062-9c16-8621c7eed535-registry-certificates\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:35.301340 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.301320 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/48b860ba-98dc-4062-9c16-8621c7eed535-installation-pull-secrets\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:35.401876 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.401846 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1d73585e-0790-49af-85fe-84de1111a4e8-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-86c6976b-b646d\" (UID: \"1d73585e-0790-49af-85fe-84de1111a4e8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" Apr 20 17:48:35.402085 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.401887 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-tmp-dir\") pod \"dns-default-p6j2d\" (UID: \"9ada05b5-e0d5-449c-83d5-41ed76dac3ee\") " pod="openshift-dns/dns-default-p6j2d" Apr 20 17:48:35.402085 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.401924 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58db5\" (UniqueName: \"kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-kube-api-access-58db5\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:35.402085 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.401954 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8l56\" (UniqueName: \"kubernetes.io/projected/c16e06c2-3dd3-4775-84f5-ec5d0058d281-kube-api-access-v8l56\") pod \"klusterlet-addon-workmgr-6ddf7b59f4-s2xrb\" (UID: \"c16e06c2-3dd3-4775-84f5-ec5d0058d281\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ddf7b59f4-s2xrb" Apr 20 17:48:35.402085 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.401986 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2qs5\" (UniqueName: \"kubernetes.io/projected/1d73585e-0790-49af-85fe-84de1111a4e8-kube-api-access-r2qs5\") pod \"cluster-proxy-proxy-agent-86c6976b-b646d\" (UID: \"1d73585e-0790-49af-85fe-84de1111a4e8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" Apr 20 17:48:35.402085 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.402003 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78694\" (UniqueName: \"kubernetes.io/projected/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-kube-api-access-78694\") pod \"dns-default-p6j2d\" (UID: 
\"9ada05b5-e0d5-449c-83d5-41ed76dac3ee\") " pod="openshift-dns/dns-default-p6j2d" Apr 20 17:48:35.402085 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.402022 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c16e06c2-3dd3-4775-84f5-ec5d0058d281-klusterlet-config\") pod \"klusterlet-addon-workmgr-6ddf7b59f4-s2xrb\" (UID: \"c16e06c2-3dd3-4775-84f5-ec5d0058d281\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ddf7b59f4-s2xrb" Apr 20 17:48:35.402085 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.402038 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls\") pod \"dns-default-p6j2d\" (UID: \"9ada05b5-e0d5-449c-83d5-41ed76dac3ee\") " pod="openshift-dns/dns-default-p6j2d" Apr 20 17:48:35.402085 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.402058 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-bound-sa-token\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:35.402486 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.402090 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/48b860ba-98dc-4062-9c16-8621c7eed535-registry-certificates\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:35.402486 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.402116 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/48b860ba-98dc-4062-9c16-8621c7eed535-installation-pull-secrets\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:35.402486 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.402146 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:35.402486 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.402173 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/1d73585e-0790-49af-85fe-84de1111a4e8-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-86c6976b-b646d\" (UID: \"1d73585e-0790-49af-85fe-84de1111a4e8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" Apr 20 17:48:35.402486 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.402203 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/1d73585e-0790-49af-85fe-84de1111a4e8-ca\") pod \"cluster-proxy-proxy-agent-86c6976b-b646d\" (UID: \"1d73585e-0790-49af-85fe-84de1111a4e8\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" Apr 20 17:48:35.402486 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.402227 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert\") pod \"ingress-canary-tqj4h\" (UID: \"baa515c8-8724-40c1-a30b-d562783453d5\") " pod="openshift-ingress-canary/ingress-canary-tqj4h" Apr 20 17:48:35.402486 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.402254 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f82115f5-7c80-4334-9e2c-bf493509b8ca-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-z92d2\" (UID: \"f82115f5-7c80-4334-9e2c-bf493509b8ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-z92d2" Apr 20 17:48:35.402486 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:35.402275 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 17:48:35.402486 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:35.402293 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c47d9bcc6-8sbg6: secret "image-registry-tls" not found Apr 20 17:48:35.402486 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:35.402347 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls podName:48b860ba-98dc-4062-9c16-8621c7eed535 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:35.902326963 +0000 UTC m=+34.677857749 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls") pod "image-registry-5c47d9bcc6-8sbg6" (UID: "48b860ba-98dc-4062-9c16-8621c7eed535") : secret "image-registry-tls" not found Apr 20 17:48:35.402486 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.402276 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/1d73585e-0790-49af-85fe-84de1111a4e8-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-86c6976b-b646d\" (UID: \"1d73585e-0790-49af-85fe-84de1111a4e8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" Apr 20 17:48:35.402486 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.402433 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4v8qr\" (UniqueName: \"kubernetes.io/projected/488f7f9a-db93-4f84-b131-8e361a2591b8-kube-api-access-4v8qr\") pod \"managed-serviceaccount-addon-agent-88bb7c64f-pl5kb\" (UID: \"488f7f9a-db93-4f84-b131-8e361a2591b8\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-88bb7c64f-pl5kb" Apr 20 17:48:35.402486 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.402468 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/1d73585e-0790-49af-85fe-84de1111a4e8-hub\") pod \"cluster-proxy-proxy-agent-86c6976b-b646d\" (UID: \"1d73585e-0790-49af-85fe-84de1111a4e8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" Apr 20 17:48:35.403117 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.402496 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/48b860ba-98dc-4062-9c16-8621c7eed535-ca-trust-extracted\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:35.403117 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.402531 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/48b860ba-98dc-4062-9c16-8621c7eed535-image-registry-private-configuration\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:35.403117 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.402549 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48b860ba-98dc-4062-9c16-8621c7eed535-trusted-ca\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:35.403117 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.402581 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs\") pod \"network-metrics-daemon-7gff7\" (UID: \"c4102ca4-2dfd-487f-85a4-c91b3ae6797e\") " pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:35.403117 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.402610 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/488f7f9a-db93-4f84-b131-8e361a2591b8-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-88bb7c64f-pl5kb\" (UID: \"488f7f9a-db93-4f84-b131-8e361a2591b8\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-88bb7c64f-pl5kb" Apr 20 17:48:35.403117 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.402653 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-z92d2\" (UID: \"f82115f5-7c80-4334-9e2c-bf493509b8ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-z92d2" Apr 20 17:48:35.403117 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.402711 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-config-volume\") pod \"dns-default-p6j2d\" (UID: \"9ada05b5-e0d5-449c-83d5-41ed76dac3ee\") " pod="openshift-dns/dns-default-p6j2d" Apr 20 17:48:35.403117 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.402743 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jhvs\" (UniqueName: \"kubernetes.io/projected/baa515c8-8724-40c1-a30b-d562783453d5-kube-api-access-9jhvs\") pod \"ingress-canary-tqj4h\" (UID: \"baa515c8-8724-40c1-a30b-d562783453d5\") " pod="openshift-ingress-canary/ingress-canary-tqj4h" Apr 20 17:48:35.403117 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.402774 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c16e06c2-3dd3-4775-84f5-ec5d0058d281-tmp\") pod \"klusterlet-addon-workmgr-6ddf7b59f4-s2xrb\" (UID: \"c16e06c2-3dd3-4775-84f5-ec5d0058d281\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ddf7b59f4-s2xrb" Apr 20 17:48:35.403117 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.403056 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f82115f5-7c80-4334-9e2c-bf493509b8ca-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-z92d2\" (UID: \"f82115f5-7c80-4334-9e2c-bf493509b8ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-z92d2" Apr 20 17:48:35.403117 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.403079 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/48b860ba-98dc-4062-9c16-8621c7eed535-ca-trust-extracted\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:35.403523 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.403140 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c16e06c2-3dd3-4775-84f5-ec5d0058d281-tmp\") pod \"klusterlet-addon-workmgr-6ddf7b59f4-s2xrb\" (UID: \"c16e06c2-3dd3-4775-84f5-ec5d0058d281\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ddf7b59f4-s2xrb" Apr 20 17:48:35.403523 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:35.403151 2581 secret.go:189] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 17:48:35.403523 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:35.403157 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:35.403523 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:35.403211 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert podName:f82115f5-7c80-4334-9e2c-bf493509b8ca nodeName:}" failed. No retries permitted until 2026-04-20 17:48:35.903194155 +0000 UTC m=+34.678724932 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-z92d2" (UID: "f82115f5-7c80-4334-9e2c-bf493509b8ca") : secret "networking-console-plugin-cert" not found Apr 20 17:48:35.403523 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.403220 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/48b860ba-98dc-4062-9c16-8621c7eed535-registry-certificates\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:35.403523 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:35.403230 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs podName:c4102ca4-2dfd-487f-85a4-c91b3ae6797e nodeName:}" failed. No retries permitted until 2026-04-20 17:49:07.403220863 +0000 UTC m=+66.178751645 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs") pod "network-metrics-daemon-7gff7" (UID: "c4102ca4-2dfd-487f-85a4-c91b3ae6797e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:35.403963 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.403918 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48b860ba-98dc-4062-9c16-8621c7eed535-trusted-ca\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:35.407242 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.407216 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/48b860ba-98dc-4062-9c16-8621c7eed535-installation-pull-secrets\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:35.407242 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.407234 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/48b860ba-98dc-4062-9c16-8621c7eed535-image-registry-private-configuration\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:35.407414 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.407326 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c16e06c2-3dd3-4775-84f5-ec5d0058d281-klusterlet-config\") pod \"klusterlet-addon-workmgr-6ddf7b59f4-s2xrb\" (UID: \"c16e06c2-3dd3-4775-84f5-ec5d0058d281\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ddf7b59f4-s2xrb" Apr 20 17:48:35.407929 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.407907 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/488f7f9a-db93-4f84-b131-8e361a2591b8-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-88bb7c64f-pl5kb\" (UID: \"488f7f9a-db93-4f84-b131-8e361a2591b8\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-88bb7c64f-pl5kb" Apr 20 17:48:35.412683 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.412599 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v8qr\" (UniqueName: \"kubernetes.io/projected/488f7f9a-db93-4f84-b131-8e361a2591b8-kube-api-access-4v8qr\") pod \"managed-serviceaccount-addon-agent-88bb7c64f-pl5kb\" (UID: \"488f7f9a-db93-4f84-b131-8e361a2591b8\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-88bb7c64f-pl5kb" Apr 20 17:48:35.412861 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.412840 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-bound-sa-token\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:35.412919 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.412870 2581 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-v8l56\" (UniqueName: \"kubernetes.io/projected/c16e06c2-3dd3-4775-84f5-ec5d0058d281-kube-api-access-v8l56\") pod \"klusterlet-addon-workmgr-6ddf7b59f4-s2xrb\" (UID: \"c16e06c2-3dd3-4775-84f5-ec5d0058d281\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ddf7b59f4-s2xrb" Apr 20 17:48:35.412919 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.412872 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58db5\" (UniqueName: \"kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-kube-api-access-58db5\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:35.491730 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.491699 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ddf7b59f4-s2xrb" Apr 20 17:48:35.503672 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.503637 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/1d73585e-0790-49af-85fe-84de1111a4e8-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-86c6976b-b646d\" (UID: \"1d73585e-0790-49af-85fe-84de1111a4e8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" Apr 20 17:48:35.503834 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.503690 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/1d73585e-0790-49af-85fe-84de1111a4e8-ca\") pod \"cluster-proxy-proxy-agent-86c6976b-b646d\" (UID: \"1d73585e-0790-49af-85fe-84de1111a4e8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" Apr 20 17:48:35.503834 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.503719 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert\") pod \"ingress-canary-tqj4h\" (UID: \"baa515c8-8724-40c1-a30b-d562783453d5\") " pod="openshift-ingress-canary/ingress-canary-tqj4h" Apr 20 17:48:35.503834 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.503743 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/1d73585e-0790-49af-85fe-84de1111a4e8-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-86c6976b-b646d\" (UID: \"1d73585e-0790-49af-85fe-84de1111a4e8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" Apr 20 17:48:35.503834 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.503787 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/1d73585e-0790-49af-85fe-84de1111a4e8-hub\") pod \"cluster-proxy-proxy-agent-86c6976b-b646d\" (UID: \"1d73585e-0790-49af-85fe-84de1111a4e8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" Apr 20 17:48:35.503834 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.503814 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlnrn\" (UniqueName: \"kubernetes.io/projected/1496881d-daad-4889-957b-ae0b63332278-kube-api-access-tlnrn\") pod \"network-check-target-5228n\" (UID: \"1496881d-daad-4889-957b-ae0b63332278\") " 
pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:48:35.504098 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:35.503845 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 17:48:35.504098 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.503859 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-config-volume\") pod \"dns-default-p6j2d\" (UID: \"9ada05b5-e0d5-449c-83d5-41ed76dac3ee\") " pod="openshift-dns/dns-default-p6j2d" Apr 20 17:48:35.504098 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.503885 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jhvs\" (UniqueName: \"kubernetes.io/projected/baa515c8-8724-40c1-a30b-d562783453d5-kube-api-access-9jhvs\") pod \"ingress-canary-tqj4h\" (UID: \"baa515c8-8724-40c1-a30b-d562783453d5\") " pod="openshift-ingress-canary/ingress-canary-tqj4h" Apr 20 17:48:35.504098 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:35.503934 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert podName:baa515c8-8724-40c1-a30b-d562783453d5 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:36.003897029 +0000 UTC m=+34.779427805 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert") pod "ingress-canary-tqj4h" (UID: "baa515c8-8724-40c1-a30b-d562783453d5") : secret "canary-serving-cert" not found Apr 20 17:48:35.504098 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.503986 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1d73585e-0790-49af-85fe-84de1111a4e8-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-86c6976b-b646d\" (UID: \"1d73585e-0790-49af-85fe-84de1111a4e8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" Apr 20 17:48:35.504098 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.504020 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-tmp-dir\") pod \"dns-default-p6j2d\" (UID: \"9ada05b5-e0d5-449c-83d5-41ed76dac3ee\") " pod="openshift-dns/dns-default-p6j2d" Apr 20 17:48:35.504098 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.504062 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2qs5\" (UniqueName: \"kubernetes.io/projected/1d73585e-0790-49af-85fe-84de1111a4e8-kube-api-access-r2qs5\") pod \"cluster-proxy-proxy-agent-86c6976b-b646d\" (UID: \"1d73585e-0790-49af-85fe-84de1111a4e8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" Apr 20 17:48:35.504098 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.504089 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78694\" (UniqueName: \"kubernetes.io/projected/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-kube-api-access-78694\") pod \"dns-default-p6j2d\" (UID: \"9ada05b5-e0d5-449c-83d5-41ed76dac3ee\") " pod="openshift-dns/dns-default-p6j2d" Apr 20 17:48:35.504446 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.504117 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls\") pod \"dns-default-p6j2d\" (UID: \"9ada05b5-e0d5-449c-83d5-41ed76dac3ee\") " pod="openshift-dns/dns-default-p6j2d" Apr 20 17:48:35.504446 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:35.504315 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 17:48:35.504446 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:35.504335 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 17:48:35.504446 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:35.504349 2581 projected.go:194] Error preparing data for projected volume kube-api-access-tlnrn for pod openshift-network-diagnostics/network-check-target-5228n: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:35.504446 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:35.504397 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1496881d-daad-4889-957b-ae0b63332278-kube-api-access-tlnrn podName:1496881d-daad-4889-957b-ae0b63332278 nodeName:}" failed. No retries permitted until 2026-04-20 17:49:07.504381688 +0000 UTC m=+66.279912461 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-tlnrn" (UniqueName: "kubernetes.io/projected/1496881d-daad-4889-957b-ae0b63332278-kube-api-access-tlnrn") pod "network-check-target-5228n" (UID: "1496881d-daad-4889-957b-ae0b63332278") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:35.504716 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.504584 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-tmp-dir\") pod \"dns-default-p6j2d\" (UID: \"9ada05b5-e0d5-449c-83d5-41ed76dac3ee\") " pod="openshift-dns/dns-default-p6j2d" Apr 20 17:48:35.504774 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:35.504679 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 17:48:35.504824 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:35.504778 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls podName:9ada05b5-e0d5-449c-83d5-41ed76dac3ee nodeName:}" failed. No retries permitted until 2026-04-20 17:48:36.00475886 +0000 UTC m=+34.780289638 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls") pod "dns-default-p6j2d" (UID: "9ada05b5-e0d5-449c-83d5-41ed76dac3ee") : secret "dns-default-metrics-tls" not found Apr 20 17:48:35.505049 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.504928 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-config-volume\") pod \"dns-default-p6j2d\" (UID: \"9ada05b5-e0d5-449c-83d5-41ed76dac3ee\") " pod="openshift-dns/dns-default-p6j2d" Apr 20 17:48:35.505049 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.505019 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/1d73585e-0790-49af-85fe-84de1111a4e8-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-86c6976b-b646d\" (UID: \"1d73585e-0790-49af-85fe-84de1111a4e8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" Apr 20 17:48:35.506763 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.506713 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/1d73585e-0790-49af-85fe-84de1111a4e8-ca\") pod \"cluster-proxy-proxy-agent-86c6976b-b646d\" (UID: \"1d73585e-0790-49af-85fe-84de1111a4e8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" Apr 20 17:48:35.506763 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.506741 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/1d73585e-0790-49af-85fe-84de1111a4e8-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-86c6976b-b646d\" (UID: \"1d73585e-0790-49af-85fe-84de1111a4e8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" Apr 20 17:48:35.506763 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.506758 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-88bb7c64f-pl5kb" Apr 20 17:48:35.507136 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.507114 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1d73585e-0790-49af-85fe-84de1111a4e8-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-86c6976b-b646d\" (UID: \"1d73585e-0790-49af-85fe-84de1111a4e8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" Apr 20 17:48:35.507136 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.507133 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/1d73585e-0790-49af-85fe-84de1111a4e8-hub\") pod \"cluster-proxy-proxy-agent-86c6976b-b646d\" (UID: \"1d73585e-0790-49af-85fe-84de1111a4e8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" Apr 20 17:48:35.513443 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.513353 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jhvs\" (UniqueName: \"kubernetes.io/projected/baa515c8-8724-40c1-a30b-d562783453d5-kube-api-access-9jhvs\") pod \"ingress-canary-tqj4h\" (UID: \"baa515c8-8724-40c1-a30b-d562783453d5\") " pod="openshift-ingress-canary/ingress-canary-tqj4h" Apr 20 17:48:35.513555 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.513447 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78694\" (UniqueName: \"kubernetes.io/projected/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-kube-api-access-78694\") pod \"dns-default-p6j2d\" (UID: \"9ada05b5-e0d5-449c-83d5-41ed76dac3ee\") " pod="openshift-dns/dns-default-p6j2d" Apr 20 17:48:35.513824 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.513734 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2qs5\" (UniqueName: \"kubernetes.io/projected/1d73585e-0790-49af-85fe-84de1111a4e8-kube-api-access-r2qs5\") pod \"cluster-proxy-proxy-agent-86c6976b-b646d\" (UID: \"1d73585e-0790-49af-85fe-84de1111a4e8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" Apr 20 17:48:35.536351 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.536313 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" Apr 20 17:48:35.908234 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.908204 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:35.908779 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:35.908309 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-z92d2\" (UID: \"f82115f5-7c80-4334-9e2c-bf493509b8ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-z92d2" Apr 20 17:48:35.908779 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:35.908386 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 17:48:35.908779 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:35.908405 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c47d9bcc6-8sbg6: secret "image-registry-tls" not found Apr 20 17:48:35.908779 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:35.908422 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 17:48:35.908779 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:35.908460 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls podName:48b860ba-98dc-4062-9c16-8621c7eed535 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:36.908444243 +0000 UTC m=+35.683975016 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls") pod "image-registry-5c47d9bcc6-8sbg6" (UID: "48b860ba-98dc-4062-9c16-8621c7eed535") : secret "image-registry-tls" not found Apr 20 17:48:35.908779 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:35.908477 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert podName:f82115f5-7c80-4334-9e2c-bf493509b8ca nodeName:}" failed. No retries permitted until 2026-04-20 17:48:36.908464951 +0000 UTC m=+35.683995723 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-z92d2" (UID: "f82115f5-7c80-4334-9e2c-bf493509b8ca") : secret "networking-console-plugin-cert" not found Apr 20 17:48:36.009083 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:36.009018 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls\") pod \"dns-default-p6j2d\" (UID: \"9ada05b5-e0d5-449c-83d5-41ed76dac3ee\") " pod="openshift-dns/dns-default-p6j2d" Apr 20 17:48:36.009274 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:36.009118 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert\") pod \"ingress-canary-tqj4h\" (UID: \"baa515c8-8724-40c1-a30b-d562783453d5\") " pod="openshift-ingress-canary/ingress-canary-tqj4h" Apr 20 17:48:36.009274 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:36.009204 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 17:48:36.009274 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:36.009204 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 17:48:36.009274 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:36.009257 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert podName:baa515c8-8724-40c1-a30b-d562783453d5 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:37.00924353 +0000 UTC m=+35.784774303 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert") pod "ingress-canary-tqj4h" (UID: "baa515c8-8724-40c1-a30b-d562783453d5") : secret "canary-serving-cert" not found Apr 20 17:48:36.009274 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:36.009270 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls podName:9ada05b5-e0d5-449c-83d5-41ed76dac3ee nodeName:}" failed. No retries permitted until 2026-04-20 17:48:37.00926463 +0000 UTC m=+35.784795403 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls") pod "dns-default-p6j2d" (UID: "9ada05b5-e0d5-449c-83d5-41ed76dac3ee") : secret "dns-default-metrics-tls" not found Apr 20 17:48:36.761533 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:36.761492 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xrxtp" Apr 20 17:48:36.761749 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:36.761491 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:48:36.761749 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:36.761502 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:48:36.768049 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:36.765082 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 17:48:36.768049 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:36.765171 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 17:48:36.768049 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:36.765244 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 17:48:36.768364 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:36.768336 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 17:48:36.768440 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:36.768402 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-swr7p\"" Apr 20 17:48:36.768500 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:36.768335 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ddm2v\"" Apr 20 17:48:36.916818 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:36.916781 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:36.917294 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:36.916907 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-z92d2\" (UID: \"f82115f5-7c80-4334-9e2c-bf493509b8ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-z92d2" Apr 20 17:48:36.917294 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:36.916952 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 17:48:36.917294 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:36.916972 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c47d9bcc6-8sbg6: secret "image-registry-tls" not found Apr 20 17:48:36.917294 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:36.917025 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls podName:48b860ba-98dc-4062-9c16-8621c7eed535 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:38.917009136 +0000 UTC m=+37.692539909 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls") pod "image-registry-5c47d9bcc6-8sbg6" (UID: "48b860ba-98dc-4062-9c16-8621c7eed535") : secret "image-registry-tls" not found Apr 20 17:48:36.917294 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:36.917029 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 17:48:36.917294 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:36.917092 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert podName:f82115f5-7c80-4334-9e2c-bf493509b8ca nodeName:}" failed. No retries permitted until 2026-04-20 17:48:38.917075989 +0000 UTC m=+37.692606764 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-z92d2" (UID: "f82115f5-7c80-4334-9e2c-bf493509b8ca") : secret "networking-console-plugin-cert" not found Apr 20 17:48:37.017733 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:37.017657 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert\") pod \"ingress-canary-tqj4h\" (UID: \"baa515c8-8724-40c1-a30b-d562783453d5\") " pod="openshift-ingress-canary/ingress-canary-tqj4h" Apr 20 17:48:37.017892 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:37.017766 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls\") pod \"dns-default-p6j2d\" (UID: \"9ada05b5-e0d5-449c-83d5-41ed76dac3ee\") " pod="openshift-dns/dns-default-p6j2d" Apr 20 17:48:37.017892 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:37.017816 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 17:48:37.017892 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:37.017873 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 17:48:37.017892 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:37.017881 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert podName:baa515c8-8724-40c1-a30b-d562783453d5 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:39.01786616 +0000 UTC m=+37.793396933 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert") pod "ingress-canary-tqj4h" (UID: "baa515c8-8724-40c1-a30b-d562783453d5") : secret "canary-serving-cert" not found Apr 20 17:48:37.018073 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:37.017926 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls podName:9ada05b5-e0d5-449c-83d5-41ed76dac3ee nodeName:}" failed. No retries permitted until 2026-04-20 17:48:39.017916285 +0000 UTC m=+37.793447058 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls") pod "dns-default-p6j2d" (UID: "9ada05b5-e0d5-449c-83d5-41ed76dac3ee") : secret "dns-default-metrics-tls" not found Apr 20 17:48:37.702878 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:37.702846 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d"] Apr 20 17:48:37.705666 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:37.705645 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-88bb7c64f-pl5kb"] Apr 20 17:48:37.714250 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:37.714230 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ddf7b59f4-s2xrb"] Apr 20 17:48:37.833383 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:37.833076 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d73585e_0790_49af_85fe_84de1111a4e8.slice/crio-5fdbc5bf1d09011e7f8431c31d0bc76c25133ad7498eec543fe90c3dd2c79e7e WatchSource:0}: Error finding container 5fdbc5bf1d09011e7f8431c31d0bc76c25133ad7498eec543fe90c3dd2c79e7e: Status 404 returned error can't find the container with id 5fdbc5bf1d09011e7f8431c31d0bc76c25133ad7498eec543fe90c3dd2c79e7e Apr 20 17:48:37.833772 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:37.833744 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod488f7f9a_db93_4f84_b131_8e361a2591b8.slice/crio-0910679ef2fdc240214e479d8e8bc84c052601f0446cd1d14b6b7a24298fea33 WatchSource:0}: Error finding container 0910679ef2fdc240214e479d8e8bc84c052601f0446cd1d14b6b7a24298fea33: Status 404 returned error can't find the container with id 0910679ef2fdc240214e479d8e8bc84c052601f0446cd1d14b6b7a24298fea33 Apr 20 17:48:37.834517 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:37.834430 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc16e06c2_3dd3_4775_84f5_ec5d0058d281.slice/crio-615f41a2e0076f66f08821a182e18af96134cbac2864b2b0b71834a27ef1f55b WatchSource:0}: Error finding container 615f41a2e0076f66f08821a182e18af96134cbac2864b2b0b71834a27ef1f55b: Status 404 returned error can't find the container with id 615f41a2e0076f66f08821a182e18af96134cbac2864b2b0b71834a27ef1f55b Apr 20 17:48:37.905724 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:37.905662 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ddf7b59f4-s2xrb" event={"ID":"c16e06c2-3dd3-4775-84f5-ec5d0058d281","Type":"ContainerStarted","Data":"615f41a2e0076f66f08821a182e18af96134cbac2864b2b0b71834a27ef1f55b"} Apr 20 17:48:37.906803 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:37.906770 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" event={"ID":"1d73585e-0790-49af-85fe-84de1111a4e8","Type":"ContainerStarted","Data":"5fdbc5bf1d09011e7f8431c31d0bc76c25133ad7498eec543fe90c3dd2c79e7e"} Apr 20 17:48:37.907806 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:37.907784 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-88bb7c64f-pl5kb" event={"ID":"488f7f9a-db93-4f84-b131-8e361a2591b8","Type":"ContainerStarted","Data":"0910679ef2fdc240214e479d8e8bc84c052601f0446cd1d14b6b7a24298fea33"} Apr 20 17:48:38.915610 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:38.915567 2581 generic.go:358] "Generic (PLEG): container finished" podID="6667bb3e-c213-4fb7-a2f0-bb9a65372bf3" containerID="0d2610dc348c1607b2d9c165f42504ba9ab72eba3647fee1097c51d2450ac278" exitCode=0 Apr 20 17:48:38.916459 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:38.915673 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9xc6w" event={"ID":"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3","Type":"ContainerDied","Data":"0d2610dc348c1607b2d9c165f42504ba9ab72eba3647fee1097c51d2450ac278"} Apr 20 17:48:38.936541 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:38.936152 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:38.936541 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:38.936269 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-z92d2\" (UID: \"f82115f5-7c80-4334-9e2c-bf493509b8ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-z92d2" Apr 20 17:48:38.936541 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:38.936317 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 17:48:38.936541 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:38.936339 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c47d9bcc6-8sbg6: secret "image-registry-tls" not found Apr 20 17:48:38.936541 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:38.936403 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls podName:48b860ba-98dc-4062-9c16-8621c7eed535 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:42.93637935 +0000 UTC m=+41.711910139 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls") pod "image-registry-5c47d9bcc6-8sbg6" (UID: "48b860ba-98dc-4062-9c16-8621c7eed535") : secret "image-registry-tls" not found Apr 20 17:48:38.936541 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:38.936454 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 17:48:38.936541 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:38.936513 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert podName:f82115f5-7c80-4334-9e2c-bf493509b8ca nodeName:}" failed. No retries permitted until 2026-04-20 17:48:42.93649397 +0000 UTC m=+41.712024757 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-z92d2" (UID: "f82115f5-7c80-4334-9e2c-bf493509b8ca") : secret "networking-console-plugin-cert" not found Apr 20 17:48:39.039416 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:39.037465 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls\") pod \"dns-default-p6j2d\" (UID: \"9ada05b5-e0d5-449c-83d5-41ed76dac3ee\") " pod="openshift-dns/dns-default-p6j2d" Apr 20 17:48:39.039416 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:39.037581 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert\") pod \"ingress-canary-tqj4h\" (UID: \"baa515c8-8724-40c1-a30b-d562783453d5\") " pod="openshift-ingress-canary/ingress-canary-tqj4h" Apr 20 17:48:39.039416 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:39.038221 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 17:48:39.039416 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:39.038281 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls podName:9ada05b5-e0d5-449c-83d5-41ed76dac3ee nodeName:}" failed. No retries permitted until 2026-04-20 17:48:43.038261806 +0000 UTC m=+41.813792583 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls") pod "dns-default-p6j2d" (UID: "9ada05b5-e0d5-449c-83d5-41ed76dac3ee") : secret "dns-default-metrics-tls" not found Apr 20 17:48:39.039416 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:39.038835 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 17:48:39.039416 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:39.038881 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert podName:baa515c8-8724-40c1-a30b-d562783453d5 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:43.03886568 +0000 UTC m=+41.814396456 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert") pod "ingress-canary-tqj4h" (UID: "baa515c8-8724-40c1-a30b-d562783453d5") : secret "canary-serving-cert" not found Apr 20 17:48:39.925227 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:39.925187 2581 generic.go:358] "Generic (PLEG): container finished" podID="6667bb3e-c213-4fb7-a2f0-bb9a65372bf3" containerID="49d6f7b3fb00ea93afd72c17a09b9d1f4735ff8a3fd1c004698af7803cbde6b8" exitCode=0 Apr 20 17:48:39.925877 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:39.925816 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9xc6w" event={"ID":"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3","Type":"ContainerDied","Data":"49d6f7b3fb00ea93afd72c17a09b9d1f4735ff8a3fd1c004698af7803cbde6b8"} Apr 20 17:48:42.933492 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:42.933444 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" event={"ID":"1d73585e-0790-49af-85fe-84de1111a4e8","Type":"ContainerStarted","Data":"350383b4b02d18b04dcf8ded9b52fa31dbbac8987999ee3b09a1c22291b0507d"} Apr 20 17:48:42.936867 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:42.936835 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9xc6w" event={"ID":"6667bb3e-c213-4fb7-a2f0-bb9a65372bf3","Type":"ContainerStarted","Data":"75acf4d081b243d50c9e5247e4ad663455173ef880255b733157f25b653c0db7"} Apr 20 17:48:42.938159 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:42.938124 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-88bb7c64f-pl5kb" event={"ID":"488f7f9a-db93-4f84-b131-8e361a2591b8","Type":"ContainerStarted","Data":"dd7c3a424e039552d760ba019eb34159dc961cbc82e4890a8e8904fec812c260"} Apr 20 17:48:42.959644 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:42.959580 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-9xc6w" podStartSLOduration=8.693083073 podStartE2EDuration="41.959565288s" podCreationTimestamp="2026-04-20 17:48:01 +0000 UTC" firstStartedPulling="2026-04-20 17:48:04.60586104 +0000 UTC m=+3.381391819" lastFinishedPulling="2026-04-20 17:48:37.872343261 +0000 UTC m=+36.647874034" observedRunningTime="2026-04-20 17:48:42.957559359 +0000 UTC m=+41.733090158" watchObservedRunningTime="2026-04-20 17:48:42.959565288 +0000 UTC m=+41.735096082" Apr 20 17:48:42.973295 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:42.973233 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-88bb7c64f-pl5kb" podStartSLOduration=22.849345664 podStartE2EDuration="26.973214553s" podCreationTimestamp="2026-04-20 17:48:16 +0000 UTC" firstStartedPulling="2026-04-20 17:48:37.846138476 +0000 UTC m=+36.621669251" lastFinishedPulling="2026-04-20 17:48:41.970007354 +0000 UTC m=+40.745538140" observedRunningTime="2026-04-20 17:48:42.972573749 +0000 UTC m=+41.748104547" watchObservedRunningTime="2026-04-20 17:48:42.973214553 +0000 UTC m=+41.748745349" Apr 20 17:48:42.976638 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:42.976587 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-z92d2\" (UID: \"f82115f5-7c80-4334-9e2c-bf493509b8ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-z92d2" Apr 20 17:48:42.976778 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:42.976724 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:42.976778 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:42.976754 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 17:48:42.976901 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:42.976834 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert podName:f82115f5-7c80-4334-9e2c-bf493509b8ca nodeName:}" failed. No retries permitted until 2026-04-20 17:48:50.976815377 +0000 UTC m=+49.752346154 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-z92d2" (UID: "f82115f5-7c80-4334-9e2c-bf493509b8ca") : secret "networking-console-plugin-cert" not found Apr 20 17:48:42.976901 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:42.976857 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 17:48:42.976901 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:42.976868 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c47d9bcc6-8sbg6: secret "image-registry-tls" not found Apr 20 17:48:42.976901 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:42.976902 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls podName:48b860ba-98dc-4062-9c16-8621c7eed535 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:50.976890772 +0000 UTC m=+49.752421545 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls") pod "image-registry-5c47d9bcc6-8sbg6" (UID: "48b860ba-98dc-4062-9c16-8621c7eed535") : secret "image-registry-tls" not found Apr 20 17:48:43.077748 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:43.077710 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls\") pod \"dns-default-p6j2d\" (UID: \"9ada05b5-e0d5-449c-83d5-41ed76dac3ee\") " pod="openshift-dns/dns-default-p6j2d" Apr 20 17:48:43.077911 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:43.077791 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert\") pod \"ingress-canary-tqj4h\" (UID: \"baa515c8-8724-40c1-a30b-d562783453d5\") " pod="openshift-ingress-canary/ingress-canary-tqj4h" Apr 20 17:48:43.077911 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:43.077879 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 17:48:43.078014 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:43.077913 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 17:48:43.078014 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:43.077954 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls podName:9ada05b5-e0d5-449c-83d5-41ed76dac3ee nodeName:}" failed. No retries permitted until 2026-04-20 17:48:51.077931566 +0000 UTC m=+49.853462353 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls") pod "dns-default-p6j2d" (UID: "9ada05b5-e0d5-449c-83d5-41ed76dac3ee") : secret "dns-default-metrics-tls" not found Apr 20 17:48:43.078014 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:43.077984 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert podName:baa515c8-8724-40c1-a30b-d562783453d5 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:51.077966189 +0000 UTC m=+49.853496966 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert") pod "ingress-canary-tqj4h" (UID: "baa515c8-8724-40c1-a30b-d562783453d5") : secret "canary-serving-cert" not found Apr 20 17:48:46.948356 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:46.948315 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ddf7b59f4-s2xrb" event={"ID":"c16e06c2-3dd3-4775-84f5-ec5d0058d281","Type":"ContainerStarted","Data":"427e9caed6bc0effb021e6c64c95395290d49453bf209a5b9307ba12e853fab2"} Apr 20 17:48:46.948822 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:46.948547 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ddf7b59f4-s2xrb" Apr 20 17:48:46.950353 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:46.950324 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ddf7b59f4-s2xrb" Apr 20 17:48:46.950353 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:46.950348 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" event={"ID":"1d73585e-0790-49af-85fe-84de1111a4e8","Type":"ContainerStarted","Data":"8a142e848e672bd8d1cde25b45b34a83b65957c0fd3eaec07aad2f779661ce64"} Apr 20 17:48:46.950484 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:46.950361 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" event={"ID":"1d73585e-0790-49af-85fe-84de1111a4e8","Type":"ContainerStarted","Data":"1d14d332e6b531043b5dbd1443989f8c667fdedc1c3056df7390450cef2b4d8a"} Apr 20 17:48:46.965414 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:46.965375 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ddf7b59f4-s2xrb" podStartSLOduration=22.766479429 podStartE2EDuration="30.965364459s" podCreationTimestamp="2026-04-20 17:48:16 +0000 UTC" firstStartedPulling="2026-04-20 17:48:37.845984982 +0000 UTC m=+36.621515762" lastFinishedPulling="2026-04-20 17:48:46.04487002 +0000 UTC m=+44.820400792" observedRunningTime="2026-04-20 17:48:46.964130294 +0000 UTC m=+45.739661102" watchObservedRunningTime="2026-04-20 17:48:46.965364459 +0000 UTC m=+45.740895251" Apr 20 17:48:46.998562 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:46.998502 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" podStartSLOduration=22.805414065 podStartE2EDuration="30.998484564s" podCreationTimestamp="2026-04-20 17:48:16 +0000 UTC" firstStartedPulling="2026-04-20 17:48:37.845984544 +0000 UTC m=+36.621515325" lastFinishedPulling="2026-04-20 17:48:46.03905505 +0000 UTC m=+44.814585824" observedRunningTime="2026-04-20 17:48:46.998305694 +0000 UTC m=+45.773836489" watchObservedRunningTime="2026-04-20 17:48:46.998484564 +0000 UTC m=+45.774015360" Apr 20 17:48:50.844566 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:50.844522 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9041de32-fafa-4935-a258-8c6ecce98d75-original-pull-secret\") pod \"global-pull-secret-syncer-xrxtp\" (UID: \"9041de32-fafa-4935-a258-8c6ecce98d75\") 
" pod="kube-system/global-pull-secret-syncer-xrxtp" Apr 20 17:48:50.848033 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:50.848010 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9041de32-fafa-4935-a258-8c6ecce98d75-original-pull-secret\") pod \"global-pull-secret-syncer-xrxtp\" (UID: \"9041de32-fafa-4935-a258-8c6ecce98d75\") " pod="kube-system/global-pull-secret-syncer-xrxtp" Apr 20 17:48:50.875082 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:50.875046 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xrxtp" Apr 20 17:48:50.989015 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:50.988981 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xrxtp"] Apr 20 17:48:50.992336 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:48:50.992299 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9041de32_fafa_4935_a258_8c6ecce98d75.slice/crio-ad151f06632c4ddcfda5b0d581516767831fbdc5b26e342f6a4e2497f28f7c30 WatchSource:0}: Error finding container ad151f06632c4ddcfda5b0d581516767831fbdc5b26e342f6a4e2497f28f7c30: Status 404 returned error can't find the container with id ad151f06632c4ddcfda5b0d581516767831fbdc5b26e342f6a4e2497f28f7c30 Apr 20 17:48:51.046297 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:51.046257 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-z92d2\" (UID: \"f82115f5-7c80-4334-9e2c-bf493509b8ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-z92d2" Apr 20 17:48:51.046429 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:51.046352 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:48:51.046429 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:51.046415 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 17:48:51.046509 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:51.046477 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert podName:f82115f5-7c80-4334-9e2c-bf493509b8ca nodeName:}" failed. No retries permitted until 2026-04-20 17:49:07.046462981 +0000 UTC m=+65.821993754 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-z92d2" (UID: "f82115f5-7c80-4334-9e2c-bf493509b8ca") : secret "networking-console-plugin-cert" not found Apr 20 17:48:51.046509 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:51.046491 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 17:48:51.046579 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:51.046512 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c47d9bcc6-8sbg6: secret "image-registry-tls" not found Apr 20 17:48:51.046579 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:51.046570 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls podName:48b860ba-98dc-4062-9c16-8621c7eed535 nodeName:}" failed. No retries permitted until 2026-04-20 17:49:07.04655663 +0000 UTC m=+65.822087403 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls") pod "image-registry-5c47d9bcc6-8sbg6" (UID: "48b860ba-98dc-4062-9c16-8621c7eed535") : secret "image-registry-tls" not found Apr 20 17:48:51.147127 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:51.147023 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert\") pod \"ingress-canary-tqj4h\" (UID: \"baa515c8-8724-40c1-a30b-d562783453d5\") " pod="openshift-ingress-canary/ingress-canary-tqj4h" Apr 20 17:48:51.147285 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:51.147171 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 17:48:51.147285 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:51.147241 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert podName:baa515c8-8724-40c1-a30b-d562783453d5 nodeName:}" failed. No retries permitted until 2026-04-20 17:49:07.147224223 +0000 UTC m=+65.922755014 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert") pod "ingress-canary-tqj4h" (UID: "baa515c8-8724-40c1-a30b-d562783453d5") : secret "canary-serving-cert" not found Apr 20 17:48:51.147398 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:51.147283 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls\") pod \"dns-default-p6j2d\" (UID: \"9ada05b5-e0d5-449c-83d5-41ed76dac3ee\") " pod="openshift-dns/dns-default-p6j2d" Apr 20 17:48:51.147398 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:51.147374 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 17:48:51.147486 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:48:51.147434 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls podName:9ada05b5-e0d5-449c-83d5-41ed76dac3ee nodeName:}" failed. 
No retries permitted until 2026-04-20 17:49:07.147417158 +0000 UTC m=+65.922947933 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls") pod "dns-default-p6j2d" (UID: "9ada05b5-e0d5-449c-83d5-41ed76dac3ee") : secret "dns-default-metrics-tls" not found Apr 20 17:48:51.962380 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:51.962337 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xrxtp" event={"ID":"9041de32-fafa-4935-a258-8c6ecce98d75","Type":"ContainerStarted","Data":"ad151f06632c4ddcfda5b0d581516767831fbdc5b26e342f6a4e2497f28f7c30"} Apr 20 17:48:55.972807 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:55.972768 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xrxtp" event={"ID":"9041de32-fafa-4935-a258-8c6ecce98d75","Type":"ContainerStarted","Data":"c2dcd9a90f9580854e40d3c56e027a29293e61cb50421bfc1ebd56df1360275d"} Apr 20 17:48:59.904508 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:59.904481 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lbls6" Apr 20 17:48:59.931156 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:48:59.931101 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-xrxtp" podStartSLOduration=38.028104077 podStartE2EDuration="41.931085551s" podCreationTimestamp="2026-04-20 17:48:18 +0000 UTC" firstStartedPulling="2026-04-20 17:48:50.994085486 +0000 UTC m=+49.769616263" lastFinishedPulling="2026-04-20 17:48:54.897066964 +0000 UTC m=+53.672597737" observedRunningTime="2026-04-20 17:48:55.988323764 +0000 UTC m=+54.763854560" watchObservedRunningTime="2026-04-20 17:48:59.931085551 +0000 UTC m=+58.706616347" Apr 20 17:49:07.076745 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:49:07.076701 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:49:07.077143 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:49:07.076813 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-z92d2\" (UID: \"f82115f5-7c80-4334-9e2c-bf493509b8ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-z92d2" Apr 20 17:49:07.077143 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:49:07.076863 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 17:49:07.077143 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:49:07.076881 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c47d9bcc6-8sbg6: secret "image-registry-tls" not found Apr 20 17:49:07.077143 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:49:07.076944 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls podName:48b860ba-98dc-4062-9c16-8621c7eed535 nodeName:}" failed. 
No retries permitted until 2026-04-20 17:49:39.076926174 +0000 UTC m=+97.852456956 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls") pod "image-registry-5c47d9bcc6-8sbg6" (UID: "48b860ba-98dc-4062-9c16-8621c7eed535") : secret "image-registry-tls" not found Apr 20 17:49:07.077143 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:49:07.076960 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 17:49:07.077143 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:49:07.077021 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert podName:f82115f5-7c80-4334-9e2c-bf493509b8ca nodeName:}" failed. No retries permitted until 2026-04-20 17:49:39.077002959 +0000 UTC m=+97.852533749 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-z92d2" (UID: "f82115f5-7c80-4334-9e2c-bf493509b8ca") : secret "networking-console-plugin-cert" not found Apr 20 17:49:07.177539 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:49:07.177498 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls\") pod \"dns-default-p6j2d\" (UID: \"9ada05b5-e0d5-449c-83d5-41ed76dac3ee\") " pod="openshift-dns/dns-default-p6j2d" Apr 20 17:49:07.177737 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:49:07.177556 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert\") pod \"ingress-canary-tqj4h\" (UID: \"baa515c8-8724-40c1-a30b-d562783453d5\") " pod="openshift-ingress-canary/ingress-canary-tqj4h" Apr 20 17:49:07.177737 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:49:07.177664 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 17:49:07.177737 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:49:07.177725 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert podName:baa515c8-8724-40c1-a30b-d562783453d5 nodeName:}" failed. No retries permitted until 2026-04-20 17:49:39.177707807 +0000 UTC m=+97.953238580 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert") pod "ingress-canary-tqj4h" (UID: "baa515c8-8724-40c1-a30b-d562783453d5") : secret "canary-serving-cert" not found Apr 20 17:49:07.177852 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:49:07.177664 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 17:49:07.177852 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:49:07.177816 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls podName:9ada05b5-e0d5-449c-83d5-41ed76dac3ee nodeName:}" failed. No retries permitted until 2026-04-20 17:49:39.177801576 +0000 UTC m=+97.953332350 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls") pod "dns-default-p6j2d" (UID: "9ada05b5-e0d5-449c-83d5-41ed76dac3ee") : secret "dns-default-metrics-tls" not found Apr 20 17:49:07.480393 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:49:07.480356 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs\") pod \"network-metrics-daemon-7gff7\" (UID: \"c4102ca4-2dfd-487f-85a4-c91b3ae6797e\") " pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:49:07.482867 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:49:07.482848 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 17:49:07.490821 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:49:07.490798 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 17:49:07.490905 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:49:07.490871 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs podName:c4102ca4-2dfd-487f-85a4-c91b3ae6797e nodeName:}" failed. No retries permitted until 2026-04-20 17:50:11.490854155 +0000 UTC m=+130.266384927 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs") pod "network-metrics-daemon-7gff7" (UID: "c4102ca4-2dfd-487f-85a4-c91b3ae6797e") : secret "metrics-daemon-secret" not found Apr 20 17:49:07.580814 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:49:07.580779 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlnrn\" (UniqueName: \"kubernetes.io/projected/1496881d-daad-4889-957b-ae0b63332278-kube-api-access-tlnrn\") pod \"network-check-target-5228n\" (UID: \"1496881d-daad-4889-957b-ae0b63332278\") " pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:49:07.583684 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:49:07.583664 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 17:49:07.594284 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:49:07.594264 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 17:49:07.604669 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:49:07.604643 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlnrn\" (UniqueName: \"kubernetes.io/projected/1496881d-daad-4889-957b-ae0b63332278-kube-api-access-tlnrn\") pod \"network-check-target-5228n\" (UID: \"1496881d-daad-4889-957b-ae0b63332278\") " pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:49:07.691115 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:49:07.691087 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-swr7p\"" Apr 20 17:49:07.699332 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:49:07.699313 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:49:07.818342 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:49:07.818310 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5228n"] Apr 20 17:49:07.820961 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:49:07.820929 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1496881d_daad_4889_957b_ae0b63332278.slice/crio-8495e431f37beb7629a2d71ed844761f6833e18afebc575bf07f8467776f94bc WatchSource:0}: Error finding container 8495e431f37beb7629a2d71ed844761f6833e18afebc575bf07f8467776f94bc: Status 404 returned error can't find the container with id 8495e431f37beb7629a2d71ed844761f6833e18afebc575bf07f8467776f94bc Apr 20 17:49:08.004712 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:49:08.004604 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5228n" event={"ID":"1496881d-daad-4889-957b-ae0b63332278","Type":"ContainerStarted","Data":"8495e431f37beb7629a2d71ed844761f6833e18afebc575bf07f8467776f94bc"} Apr 20 17:49:11.014368 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:49:11.014280 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5228n" event={"ID":"1496881d-daad-4889-957b-ae0b63332278","Type":"ContainerStarted","Data":"61d7f6d1fba53554811eb4ebd68df5ad6931fadb1342561c2f767cccec6b47dc"} Apr 20 17:49:11.014731 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:49:11.014405 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:49:11.029167 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:49:11.029112 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-5228n" podStartSLOduration=67.113631586 podStartE2EDuration="1m10.029098136s" podCreationTimestamp="2026-04-20 17:48:01 +0000 UTC" firstStartedPulling="2026-04-20 17:49:07.822933699 +0000 UTC m=+66.598464472" lastFinishedPulling="2026-04-20 17:49:10.738400249 +0000 UTC m=+69.513931022" observedRunningTime="2026-04-20 17:49:11.028934824 +0000 UTC m=+69.804465622" watchObservedRunningTime="2026-04-20 17:49:11.029098136 +0000 UTC m=+69.804628932" Apr 20 17:49:39.127772 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:49:39.127732 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:49:39.128203 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:49:39.127835 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-z92d2\" (UID: \"f82115f5-7c80-4334-9e2c-bf493509b8ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-z92d2" Apr 20 17:49:39.128203 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:49:39.127892 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 17:49:39.128203 
ip-10-0-138-9 kubenswrapper[2581]: E0420 17:49:39.127911 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c47d9bcc6-8sbg6: secret "image-registry-tls" not found Apr 20 17:49:39.128203 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:49:39.127963 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 17:49:39.128203 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:49:39.127993 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls podName:48b860ba-98dc-4062-9c16-8621c7eed535 nodeName:}" failed. No retries permitted until 2026-04-20 17:50:43.12797583 +0000 UTC m=+161.903506620 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls") pod "image-registry-5c47d9bcc6-8sbg6" (UID: "48b860ba-98dc-4062-9c16-8621c7eed535") : secret "image-registry-tls" not found Apr 20 17:49:39.128203 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:49:39.128024 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert podName:f82115f5-7c80-4334-9e2c-bf493509b8ca nodeName:}" failed. No retries permitted until 2026-04-20 17:50:43.128010205 +0000 UTC m=+161.903540977 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-z92d2" (UID: "f82115f5-7c80-4334-9e2c-bf493509b8ca") : secret "networking-console-plugin-cert" not found Apr 20 17:49:39.229177 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:49:39.229127 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert\") pod \"ingress-canary-tqj4h\" (UID: \"baa515c8-8724-40c1-a30b-d562783453d5\") " pod="openshift-ingress-canary/ingress-canary-tqj4h" Apr 20 17:49:39.229348 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:49:39.229232 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls\") pod \"dns-default-p6j2d\" (UID: \"9ada05b5-e0d5-449c-83d5-41ed76dac3ee\") " pod="openshift-dns/dns-default-p6j2d" Apr 20 17:49:39.229348 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:49:39.229269 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 17:49:39.229348 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:49:39.229328 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 17:49:39.229348 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:49:39.229338 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert podName:baa515c8-8724-40c1-a30b-d562783453d5 nodeName:}" failed. No retries permitted until 2026-04-20 17:50:43.229323477 +0000 UTC m=+162.004854250 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert") pod "ingress-canary-tqj4h" (UID: "baa515c8-8724-40c1-a30b-d562783453d5") : secret "canary-serving-cert" not found Apr 20 17:49:39.229547 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:49:39.229367 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls podName:9ada05b5-e0d5-449c-83d5-41ed76dac3ee nodeName:}" failed. No retries permitted until 2026-04-20 17:50:43.22935683 +0000 UTC m=+162.004887603 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls") pod "dns-default-p6j2d" (UID: "9ada05b5-e0d5-449c-83d5-41ed76dac3ee") : secret "dns-default-metrics-tls" not found Apr 20 17:49:42.019544 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:49:42.019507 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-5228n" Apr 20 17:50:11.580252 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:50:11.580206 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs\") pod \"network-metrics-daemon-7gff7\" (UID: \"c4102ca4-2dfd-487f-85a4-c91b3ae6797e\") " pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:50:11.580873 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:50:11.580388 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 17:50:11.580873 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:50:11.580479 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs podName:c4102ca4-2dfd-487f-85a4-c91b3ae6797e nodeName:}" failed. No retries permitted until 2026-04-20 17:52:13.580455039 +0000 UTC m=+252.355985813 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs") pod "network-metrics-daemon-7gff7" (UID: "c4102ca4-2dfd-487f-85a4-c91b3ae6797e") : secret "metrics-daemon-secret" not found Apr 20 17:50:38.153451 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:50:38.153402 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-z92d2" podUID="f82115f5-7c80-4334-9e2c-bf493509b8ca" Apr 20 17:50:38.214603 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:50:38.214574 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-z92d2" Apr 20 17:50:38.220104 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:50:38.220062 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" podUID="48b860ba-98dc-4062-9c16-8621c7eed535" Apr 20 17:50:38.270508 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:50:38.270468 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-p6j2d" podUID="9ada05b5-e0d5-449c-83d5-41ed76dac3ee" Apr 20 17:50:38.276606 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:50:38.276577 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-tqj4h" podUID="baa515c8-8724-40c1-a30b-d562783453d5" Apr 20 17:50:39.216909 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:50:39.216881 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:50:39.216909 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:50:39.216909 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tqj4h" Apr 20 17:50:39.217426 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:50:39.217024 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p6j2d" Apr 20 17:50:39.782916 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:50:39.782862 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-7gff7" podUID="c4102ca4-2dfd-487f-85a4-c91b3ae6797e" Apr 20 17:50:42.225103 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:50:42.225063 2581 generic.go:358] "Generic (PLEG): container finished" podID="488f7f9a-db93-4f84-b131-8e361a2591b8" containerID="dd7c3a424e039552d760ba019eb34159dc961cbc82e4890a8e8904fec812c260" exitCode=255 Apr 20 17:50:42.225497 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:50:42.225137 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-88bb7c64f-pl5kb" event={"ID":"488f7f9a-db93-4f84-b131-8e361a2591b8","Type":"ContainerDied","Data":"dd7c3a424e039552d760ba019eb34159dc961cbc82e4890a8e8904fec812c260"} Apr 20 17:50:42.225497 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:50:42.225469 2581 scope.go:117] "RemoveContainer" containerID="dd7c3a424e039552d760ba019eb34159dc961cbc82e4890a8e8904fec812c260" Apr 20 17:50:43.228341 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:50:43.228241 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls\") pod \"image-registry-5c47d9bcc6-8sbg6\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:50:43.228341 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:50:43.228317 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-z92d2\" (UID: \"f82115f5-7c80-4334-9e2c-bf493509b8ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-z92d2" Apr 20 17:50:43.228914 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:50:43.228346 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 17:50:43.228914 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:50:43.228363 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c47d9bcc6-8sbg6: secret "image-registry-tls" not found Apr 20 17:50:43.228914 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:50:43.228420 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 17:50:43.228914 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:50:43.228441 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls podName:48b860ba-98dc-4062-9c16-8621c7eed535 nodeName:}" failed. No retries permitted until 2026-04-20 17:52:45.228417697 +0000 UTC m=+284.003948488 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls") pod "image-registry-5c47d9bcc6-8sbg6" (UID: "48b860ba-98dc-4062-9c16-8621c7eed535") : secret "image-registry-tls" not found Apr 20 17:50:43.228914 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:50:43.228488 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert podName:f82115f5-7c80-4334-9e2c-bf493509b8ca nodeName:}" failed. No retries permitted until 2026-04-20 17:52:45.228464644 +0000 UTC m=+284.003995420 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-z92d2" (UID: "f82115f5-7c80-4334-9e2c-bf493509b8ca") : secret "networking-console-plugin-cert" not found Apr 20 17:50:43.228914 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:50:43.228895 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-88bb7c64f-pl5kb" event={"ID":"488f7f9a-db93-4f84-b131-8e361a2591b8","Type":"ContainerStarted","Data":"901729986de12f92cfd5966d864690b8664f6ad07a56a54a46521c3d35076cc1"} Apr 20 17:50:43.328872 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:50:43.328830 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls\") pod \"dns-default-p6j2d\" (UID: \"9ada05b5-e0d5-449c-83d5-41ed76dac3ee\") " pod="openshift-dns/dns-default-p6j2d" Apr 20 17:50:43.329047 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:50:43.328900 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert\") pod \"ingress-canary-tqj4h\" (UID: \"baa515c8-8724-40c1-a30b-d562783453d5\") " pod="openshift-ingress-canary/ingress-canary-tqj4h" Apr 20 17:50:43.329047 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:50:43.328983 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 17:50:43.329125 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:50:43.329044 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 17:50:43.329125 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:50:43.329050 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls podName:9ada05b5-e0d5-449c-83d5-41ed76dac3ee nodeName:}" failed. No retries permitted until 2026-04-20 17:52:45.329031331 +0000 UTC m=+284.104562108 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls") pod "dns-default-p6j2d" (UID: "9ada05b5-e0d5-449c-83d5-41ed76dac3ee") : secret "dns-default-metrics-tls" not found Apr 20 17:50:43.329208 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:50:43.329127 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert podName:baa515c8-8724-40c1-a30b-d562783453d5 nodeName:}" failed. No retries permitted until 2026-04-20 17:52:45.329098736 +0000 UTC m=+284.104629510 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert") pod "ingress-canary-tqj4h" (UID: "baa515c8-8724-40c1-a30b-d562783453d5") : secret "canary-serving-cert" not found Apr 20 17:50:46.237236 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:50:46.237203 2581 generic.go:358] "Generic (PLEG): container finished" podID="c16e06c2-3dd3-4775-84f5-ec5d0058d281" containerID="427e9caed6bc0effb021e6c64c95395290d49453bf209a5b9307ba12e853fab2" exitCode=1 Apr 20 17:50:46.237664 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:50:46.237261 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ddf7b59f4-s2xrb" event={"ID":"c16e06c2-3dd3-4775-84f5-ec5d0058d281","Type":"ContainerDied","Data":"427e9caed6bc0effb021e6c64c95395290d49453bf209a5b9307ba12e853fab2"} Apr 20 17:50:46.237664 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:50:46.237595 2581 scope.go:117] "RemoveContainer" containerID="427e9caed6bc0effb021e6c64c95395290d49453bf209a5b9307ba12e853fab2" Apr 20 17:50:46.371450 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:50:46.371419 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-n9gmk_a1c5a7fc-92d7-49a4-85c8-128fe8e46b19/dns-node-resolver/0.log" Apr 20 17:50:46.948724 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:50:46.948690 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ddf7b59f4-s2xrb" Apr 20 17:50:47.172187 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:50:47.172161 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-g8b7h_1a64b0a7-defc-4ef8-b833-3e4b069784b3/node-ca/0.log" Apr 20 17:50:47.240825 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:50:47.240734 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ddf7b59f4-s2xrb" event={"ID":"c16e06c2-3dd3-4775-84f5-ec5d0058d281","Type":"ContainerStarted","Data":"fd6537ecbaa2ec743fdce03fe48919a40ee493d48b7fe6c4d85b8661d2ced55f"} Apr 20 17:50:47.241211 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:50:47.240981 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ddf7b59f4-s2xrb" Apr 20 17:50:47.241591 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:50:47.241575 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6ddf7b59f4-s2xrb" Apr 20 17:50:52.762240 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:50:52.762194 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:51:07.147266 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:07.147233 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-txgpz"] Apr 20 17:51:07.150272 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:07.150256 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-txgpz" Apr 20 17:51:07.152987 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:07.152960 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 17:51:07.153118 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:07.153018 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-lvbdx\"" Apr 20 17:51:07.154205 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:07.154084 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 17:51:07.154323 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:07.154097 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 17:51:07.154323 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:07.154100 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 17:51:07.161765 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:07.161737 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-txgpz"] Apr 20 17:51:07.210018 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:07.209984 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7229afca-8988-40d4-85ba-ee0d637055e8-crio-socket\") pod \"insights-runtime-extractor-txgpz\" (UID: \"7229afca-8988-40d4-85ba-ee0d637055e8\") " pod="openshift-insights/insights-runtime-extractor-txgpz" Apr 20 17:51:07.210229 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:07.210058 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8hs5\" (UniqueName: \"kubernetes.io/projected/7229afca-8988-40d4-85ba-ee0d637055e8-kube-api-access-g8hs5\") pod \"insights-runtime-extractor-txgpz\" (UID: \"7229afca-8988-40d4-85ba-ee0d637055e8\") " pod="openshift-insights/insights-runtime-extractor-txgpz" Apr 20 17:51:07.210229 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:07.210088 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7229afca-8988-40d4-85ba-ee0d637055e8-data-volume\") pod \"insights-runtime-extractor-txgpz\" (UID: \"7229afca-8988-40d4-85ba-ee0d637055e8\") " pod="openshift-insights/insights-runtime-extractor-txgpz" Apr 20 17:51:07.210229 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:07.210167 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7229afca-8988-40d4-85ba-ee0d637055e8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-txgpz\" (UID: \"7229afca-8988-40d4-85ba-ee0d637055e8\") " pod="openshift-insights/insights-runtime-extractor-txgpz" Apr 20 17:51:07.210229 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:07.210208 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7229afca-8988-40d4-85ba-ee0d637055e8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-txgpz\" (UID: \"7229afca-8988-40d4-85ba-ee0d637055e8\") " 
pod="openshift-insights/insights-runtime-extractor-txgpz" Apr 20 17:51:07.310946 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:07.310908 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7229afca-8988-40d4-85ba-ee0d637055e8-crio-socket\") pod \"insights-runtime-extractor-txgpz\" (UID: \"7229afca-8988-40d4-85ba-ee0d637055e8\") " pod="openshift-insights/insights-runtime-extractor-txgpz" Apr 20 17:51:07.311144 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:07.310973 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8hs5\" (UniqueName: \"kubernetes.io/projected/7229afca-8988-40d4-85ba-ee0d637055e8-kube-api-access-g8hs5\") pod \"insights-runtime-extractor-txgpz\" (UID: \"7229afca-8988-40d4-85ba-ee0d637055e8\") " pod="openshift-insights/insights-runtime-extractor-txgpz" Apr 20 17:51:07.311144 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:07.311002 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7229afca-8988-40d4-85ba-ee0d637055e8-data-volume\") pod \"insights-runtime-extractor-txgpz\" (UID: \"7229afca-8988-40d4-85ba-ee0d637055e8\") " pod="openshift-insights/insights-runtime-extractor-txgpz" Apr 20 17:51:07.311144 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:07.311007 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7229afca-8988-40d4-85ba-ee0d637055e8-crio-socket\") pod \"insights-runtime-extractor-txgpz\" (UID: \"7229afca-8988-40d4-85ba-ee0d637055e8\") " pod="openshift-insights/insights-runtime-extractor-txgpz" Apr 20 17:51:07.311144 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:07.311099 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7229afca-8988-40d4-85ba-ee0d637055e8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-txgpz\" (UID: \"7229afca-8988-40d4-85ba-ee0d637055e8\") " pod="openshift-insights/insights-runtime-extractor-txgpz" Apr 20 17:51:07.311144 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:07.311143 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7229afca-8988-40d4-85ba-ee0d637055e8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-txgpz\" (UID: \"7229afca-8988-40d4-85ba-ee0d637055e8\") " pod="openshift-insights/insights-runtime-extractor-txgpz" Apr 20 17:51:07.311412 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:07.311328 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7229afca-8988-40d4-85ba-ee0d637055e8-data-volume\") pod \"insights-runtime-extractor-txgpz\" (UID: \"7229afca-8988-40d4-85ba-ee0d637055e8\") " pod="openshift-insights/insights-runtime-extractor-txgpz" Apr 20 17:51:07.311776 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:07.311751 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7229afca-8988-40d4-85ba-ee0d637055e8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-txgpz\" (UID: \"7229afca-8988-40d4-85ba-ee0d637055e8\") " pod="openshift-insights/insights-runtime-extractor-txgpz" Apr 20 17:51:07.313462 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:07.313439 2581 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7229afca-8988-40d4-85ba-ee0d637055e8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-txgpz\" (UID: \"7229afca-8988-40d4-85ba-ee0d637055e8\") " pod="openshift-insights/insights-runtime-extractor-txgpz" Apr 20 17:51:07.318983 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:07.318961 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8hs5\" (UniqueName: \"kubernetes.io/projected/7229afca-8988-40d4-85ba-ee0d637055e8-kube-api-access-g8hs5\") pod \"insights-runtime-extractor-txgpz\" (UID: \"7229afca-8988-40d4-85ba-ee0d637055e8\") " pod="openshift-insights/insights-runtime-extractor-txgpz" Apr 20 17:51:07.459792 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:07.459752 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-txgpz" Apr 20 17:51:07.580890 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:07.580859 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-txgpz"] Apr 20 17:51:07.584262 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:51:07.584233 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7229afca_8988_40d4_85ba_ee0d637055e8.slice/crio-7cdaa1d87e126f26211544cb743a0be9d7a157085d35752a61c03cff12d9decb WatchSource:0}: Error finding container 7cdaa1d87e126f26211544cb743a0be9d7a157085d35752a61c03cff12d9decb: Status 404 returned error can't find the container with id 7cdaa1d87e126f26211544cb743a0be9d7a157085d35752a61c03cff12d9decb Apr 20 17:51:08.291290 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:08.291252 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-txgpz" event={"ID":"7229afca-8988-40d4-85ba-ee0d637055e8","Type":"ContainerStarted","Data":"88aad122ae7925534bf542d991d680c27bb6657ac67b110e80228738d360c80c"} Apr 20 17:51:08.291717 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:08.291298 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-txgpz" event={"ID":"7229afca-8988-40d4-85ba-ee0d637055e8","Type":"ContainerStarted","Data":"7cdaa1d87e126f26211544cb743a0be9d7a157085d35752a61c03cff12d9decb"} Apr 20 17:51:09.296086 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:09.296047 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-txgpz" event={"ID":"7229afca-8988-40d4-85ba-ee0d637055e8","Type":"ContainerStarted","Data":"e89ab14ce5991d09f281c31b4a8d9a48ff9e0f3238761edccdabed05d7f49b40"} Apr 20 17:51:10.300014 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:10.299975 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-txgpz" event={"ID":"7229afca-8988-40d4-85ba-ee0d637055e8","Type":"ContainerStarted","Data":"18186fccf122c7896f41dfb1e948a8ed3e663c745603e8111e1874d7c221a4f4"} Apr 20 17:51:22.268248 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.268197 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-txgpz" podStartSLOduration=13.018242837 podStartE2EDuration="15.268181243s" podCreationTimestamp="2026-04-20 17:51:07 +0000 UTC" firstStartedPulling="2026-04-20 17:51:07.634393438 +0000 UTC m=+186.409924211" 
lastFinishedPulling="2026-04-20 17:51:09.884331831 +0000 UTC m=+188.659862617" observedRunningTime="2026-04-20 17:51:10.320565557 +0000 UTC m=+189.096096353" watchObservedRunningTime="2026-04-20 17:51:22.268181243 +0000 UTC m=+201.043712038" Apr 20 17:51:22.268759 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.268541 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-lcpnp"] Apr 20 17:51:22.271893 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.271877 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:22.276857 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.276593 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 17:51:22.276857 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.276609 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 17:51:22.277604 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.277574 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 17:51:22.277773 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.277638 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-75n4m\"" Apr 20 17:51:22.277773 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.277643 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 17:51:22.277773 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.277683 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 17:51:22.277978 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.277819 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 17:51:22.427683 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.427651 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4dbb2025-5637-4338-971e-382de0d4f73b-root\") pod \"node-exporter-lcpnp\" (UID: \"4dbb2025-5637-4338-971e-382de0d4f73b\") " pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:22.427853 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.427713 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4dbb2025-5637-4338-971e-382de0d4f73b-node-exporter-wtmp\") pod \"node-exporter-lcpnp\" (UID: \"4dbb2025-5637-4338-971e-382de0d4f73b\") " pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:22.427853 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.427737 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4dbb2025-5637-4338-971e-382de0d4f73b-metrics-client-ca\") pod \"node-exporter-lcpnp\" (UID: \"4dbb2025-5637-4338-971e-382de0d4f73b\") " pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:22.427853 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.427754 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4dbb2025-5637-4338-971e-382de0d4f73b-node-exporter-tls\") pod \"node-exporter-lcpnp\" (UID: \"4dbb2025-5637-4338-971e-382de0d4f73b\") " pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:22.427853 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.427779 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4dbb2025-5637-4338-971e-382de0d4f73b-sys\") pod \"node-exporter-lcpnp\" (UID: \"4dbb2025-5637-4338-971e-382de0d4f73b\") " pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:22.427853 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.427796 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4dbb2025-5637-4338-971e-382de0d4f73b-node-exporter-accelerators-collector-config\") pod \"node-exporter-lcpnp\" (UID: \"4dbb2025-5637-4338-971e-382de0d4f73b\") " pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:22.427853 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.427820 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4dbb2025-5637-4338-971e-382de0d4f73b-node-exporter-textfile\") pod \"node-exporter-lcpnp\" (UID: \"4dbb2025-5637-4338-971e-382de0d4f73b\") " pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:22.428037 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.427878 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4dbb2025-5637-4338-971e-382de0d4f73b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lcpnp\" (UID: \"4dbb2025-5637-4338-971e-382de0d4f73b\") " pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:22.428037 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.427916 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gnwb\" (UniqueName: \"kubernetes.io/projected/4dbb2025-5637-4338-971e-382de0d4f73b-kube-api-access-4gnwb\") pod \"node-exporter-lcpnp\" (UID: \"4dbb2025-5637-4338-971e-382de0d4f73b\") " pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:22.528346 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.528265 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4dbb2025-5637-4338-971e-382de0d4f73b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lcpnp\" (UID: \"4dbb2025-5637-4338-971e-382de0d4f73b\") " pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:22.528492 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.528410 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gnwb\" (UniqueName: \"kubernetes.io/projected/4dbb2025-5637-4338-971e-382de0d4f73b-kube-api-access-4gnwb\") pod \"node-exporter-lcpnp\" (UID: \"4dbb2025-5637-4338-971e-382de0d4f73b\") " pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:22.528492 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.528444 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/4dbb2025-5637-4338-971e-382de0d4f73b-root\") pod \"node-exporter-lcpnp\" (UID: \"4dbb2025-5637-4338-971e-382de0d4f73b\") " pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:22.528597 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.528544 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4dbb2025-5637-4338-971e-382de0d4f73b-root\") pod \"node-exporter-lcpnp\" (UID: \"4dbb2025-5637-4338-971e-382de0d4f73b\") " pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:22.528693 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.528666 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4dbb2025-5637-4338-971e-382de0d4f73b-node-exporter-wtmp\") pod \"node-exporter-lcpnp\" (UID: \"4dbb2025-5637-4338-971e-382de0d4f73b\") " pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:22.528758 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.528727 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4dbb2025-5637-4338-971e-382de0d4f73b-metrics-client-ca\") pod \"node-exporter-lcpnp\" (UID: \"4dbb2025-5637-4338-971e-382de0d4f73b\") " pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:22.528810 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.528762 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4dbb2025-5637-4338-971e-382de0d4f73b-node-exporter-tls\") pod \"node-exporter-lcpnp\" (UID: \"4dbb2025-5637-4338-971e-382de0d4f73b\") " pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:22.528863 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.528810 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4dbb2025-5637-4338-971e-382de0d4f73b-sys\") pod \"node-exporter-lcpnp\" (UID: \"4dbb2025-5637-4338-971e-382de0d4f73b\") " pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:22.528863 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.528840 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4dbb2025-5637-4338-971e-382de0d4f73b-node-exporter-accelerators-collector-config\") pod \"node-exporter-lcpnp\" (UID: \"4dbb2025-5637-4338-971e-382de0d4f73b\") " pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:22.528863 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.528851 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4dbb2025-5637-4338-971e-382de0d4f73b-node-exporter-wtmp\") pod \"node-exporter-lcpnp\" (UID: \"4dbb2025-5637-4338-971e-382de0d4f73b\") " pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:22.529006 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.528877 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4dbb2025-5637-4338-971e-382de0d4f73b-node-exporter-textfile\") pod \"node-exporter-lcpnp\" (UID: \"4dbb2025-5637-4338-971e-382de0d4f73b\") " pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:22.529006 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:51:22.528911 2581 secret.go:189] Couldn't get secret 
openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 17:51:22.529006 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.528913 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4dbb2025-5637-4338-971e-382de0d4f73b-sys\") pod \"node-exporter-lcpnp\" (UID: \"4dbb2025-5637-4338-971e-382de0d4f73b\") " pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:22.529006 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:51:22.528995 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4dbb2025-5637-4338-971e-382de0d4f73b-node-exporter-tls podName:4dbb2025-5637-4338-971e-382de0d4f73b nodeName:}" failed. No retries permitted until 2026-04-20 17:51:23.02896915 +0000 UTC m=+201.804500140 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/4dbb2025-5637-4338-971e-382de0d4f73b-node-exporter-tls") pod "node-exporter-lcpnp" (UID: "4dbb2025-5637-4338-971e-382de0d4f73b") : secret "node-exporter-tls" not found Apr 20 17:51:22.529262 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.529241 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4dbb2025-5637-4338-971e-382de0d4f73b-node-exporter-textfile\") pod \"node-exporter-lcpnp\" (UID: \"4dbb2025-5637-4338-971e-382de0d4f73b\") " pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:22.529388 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.529357 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4dbb2025-5637-4338-971e-382de0d4f73b-metrics-client-ca\") pod \"node-exporter-lcpnp\" (UID: \"4dbb2025-5637-4338-971e-382de0d4f73b\") " pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:22.529463 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.529439 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4dbb2025-5637-4338-971e-382de0d4f73b-node-exporter-accelerators-collector-config\") pod \"node-exporter-lcpnp\" (UID: \"4dbb2025-5637-4338-971e-382de0d4f73b\") " pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:22.530660 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.530638 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4dbb2025-5637-4338-971e-382de0d4f73b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lcpnp\" (UID: \"4dbb2025-5637-4338-971e-382de0d4f73b\") " pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:22.537738 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:22.537717 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gnwb\" (UniqueName: \"kubernetes.io/projected/4dbb2025-5637-4338-971e-382de0d4f73b-kube-api-access-4gnwb\") pod \"node-exporter-lcpnp\" (UID: \"4dbb2025-5637-4338-971e-382de0d4f73b\") " pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:23.034182 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:23.034144 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4dbb2025-5637-4338-971e-382de0d4f73b-node-exporter-tls\") pod \"node-exporter-lcpnp\" (UID: \"4dbb2025-5637-4338-971e-382de0d4f73b\") " 
pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:23.036420 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:23.036400 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4dbb2025-5637-4338-971e-382de0d4f73b-node-exporter-tls\") pod \"node-exporter-lcpnp\" (UID: \"4dbb2025-5637-4338-971e-382de0d4f73b\") " pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:23.183079 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:23.183045 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-lcpnp" Apr 20 17:51:23.191021 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:51:23.190978 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dbb2025_5637_4338_971e_382de0d4f73b.slice/crio-3498fd5b6896c4cc047b873f82d3a4f16f89f2a7fdf7efaf7d2e30a6bdae80f0 WatchSource:0}: Error finding container 3498fd5b6896c4cc047b873f82d3a4f16f89f2a7fdf7efaf7d2e30a6bdae80f0: Status 404 returned error can't find the container with id 3498fd5b6896c4cc047b873f82d3a4f16f89f2a7fdf7efaf7d2e30a6bdae80f0 Apr 20 17:51:23.331224 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:23.331127 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lcpnp" event={"ID":"4dbb2025-5637-4338-971e-382de0d4f73b","Type":"ContainerStarted","Data":"3498fd5b6896c4cc047b873f82d3a4f16f89f2a7fdf7efaf7d2e30a6bdae80f0"} Apr 20 17:51:24.334571 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:24.334479 2581 generic.go:358] "Generic (PLEG): container finished" podID="4dbb2025-5637-4338-971e-382de0d4f73b" containerID="95df578de4bfeda369189d9460c8b3ef4035184a7b62680efbea9a457fe73892" exitCode=0 Apr 20 17:51:24.334571 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:24.334547 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lcpnp" event={"ID":"4dbb2025-5637-4338-971e-382de0d4f73b","Type":"ContainerDied","Data":"95df578de4bfeda369189d9460c8b3ef4035184a7b62680efbea9a457fe73892"} Apr 20 17:51:25.339036 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:25.338999 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lcpnp" event={"ID":"4dbb2025-5637-4338-971e-382de0d4f73b","Type":"ContainerStarted","Data":"068ba4cd673a492dc7339e3aa5d86fcdac75f9228bd846c7a151f484f91854f7"} Apr 20 17:51:25.339036 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:25.339035 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lcpnp" event={"ID":"4dbb2025-5637-4338-971e-382de0d4f73b","Type":"ContainerStarted","Data":"a85401cd6eea53409c8d0e184bb6b78be5e60c3b4658a492f3e5b562adeb43dd"} Apr 20 17:51:25.361664 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:25.361597 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-lcpnp" podStartSLOduration=2.519187015 podStartE2EDuration="3.361581587s" podCreationTimestamp="2026-04-20 17:51:22 +0000 UTC" firstStartedPulling="2026-04-20 17:51:23.192876396 +0000 UTC m=+201.968407169" lastFinishedPulling="2026-04-20 17:51:24.035270961 +0000 UTC m=+202.810801741" observedRunningTime="2026-04-20 17:51:25.360962137 +0000 UTC m=+204.136492933" watchObservedRunningTime="2026-04-20 17:51:25.361581587 +0000 UTC m=+204.137112411" Apr 20 17:51:25.537874 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:25.537817 2581 prober.go:120] "Probe 
failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" podUID="1d73585e-0790-49af-85fe-84de1111a4e8" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 17:51:29.150829 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:29.150785 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5c47d9bcc6-8sbg6"] Apr 20 17:51:29.151339 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:51:29.151060 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" podUID="48b860ba-98dc-4062-9c16-8621c7eed535" Apr 20 17:51:29.348855 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:29.348825 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:51:29.353022 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:29.352998 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:51:29.487371 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:29.487276 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58db5\" (UniqueName: \"kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-kube-api-access-58db5\") pod \"48b860ba-98dc-4062-9c16-8621c7eed535\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " Apr 20 17:51:29.487371 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:29.487321 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-bound-sa-token\") pod \"48b860ba-98dc-4062-9c16-8621c7eed535\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " Apr 20 17:51:29.487371 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:29.487344 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/48b860ba-98dc-4062-9c16-8621c7eed535-ca-trust-extracted\") pod \"48b860ba-98dc-4062-9c16-8621c7eed535\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " Apr 20 17:51:29.487371 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:29.487363 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48b860ba-98dc-4062-9c16-8621c7eed535-trusted-ca\") pod \"48b860ba-98dc-4062-9c16-8621c7eed535\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " Apr 20 17:51:29.487713 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:29.487415 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/48b860ba-98dc-4062-9c16-8621c7eed535-image-registry-private-configuration\") pod \"48b860ba-98dc-4062-9c16-8621c7eed535\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " Apr 20 17:51:29.487713 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:29.487433 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/48b860ba-98dc-4062-9c16-8621c7eed535-registry-certificates\") pod \"48b860ba-98dc-4062-9c16-8621c7eed535\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " Apr 20 
17:51:29.487713 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:29.487463 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/48b860ba-98dc-4062-9c16-8621c7eed535-installation-pull-secrets\") pod \"48b860ba-98dc-4062-9c16-8621c7eed535\" (UID: \"48b860ba-98dc-4062-9c16-8621c7eed535\") " Apr 20 17:51:29.487713 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:29.487605 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48b860ba-98dc-4062-9c16-8621c7eed535-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "48b860ba-98dc-4062-9c16-8621c7eed535" (UID: "48b860ba-98dc-4062-9c16-8621c7eed535"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 17:51:29.487911 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:29.487807 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48b860ba-98dc-4062-9c16-8621c7eed535-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "48b860ba-98dc-4062-9c16-8621c7eed535" (UID: "48b860ba-98dc-4062-9c16-8621c7eed535"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 17:51:29.487951 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:29.487913 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48b860ba-98dc-4062-9c16-8621c7eed535-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "48b860ba-98dc-4062-9c16-8621c7eed535" (UID: "48b860ba-98dc-4062-9c16-8621c7eed535"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 17:51:29.489773 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:29.489744 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48b860ba-98dc-4062-9c16-8621c7eed535-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "48b860ba-98dc-4062-9c16-8621c7eed535" (UID: "48b860ba-98dc-4062-9c16-8621c7eed535"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 17:51:29.489881 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:29.489767 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "48b860ba-98dc-4062-9c16-8621c7eed535" (UID: "48b860ba-98dc-4062-9c16-8621c7eed535"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 17:51:29.489881 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:29.489798 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48b860ba-98dc-4062-9c16-8621c7eed535-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "48b860ba-98dc-4062-9c16-8621c7eed535" (UID: "48b860ba-98dc-4062-9c16-8621c7eed535"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 17:51:29.489881 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:29.489834 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-kube-api-access-58db5" (OuterVolumeSpecName: "kube-api-access-58db5") pod "48b860ba-98dc-4062-9c16-8621c7eed535" (UID: "48b860ba-98dc-4062-9c16-8621c7eed535"). InnerVolumeSpecName "kube-api-access-58db5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 17:51:29.588584 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:29.588538 2581 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/48b860ba-98dc-4062-9c16-8621c7eed535-image-registry-private-configuration\") on node \"ip-10-0-138-9.ec2.internal\" DevicePath \"\"" Apr 20 17:51:29.588584 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:29.588575 2581 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/48b860ba-98dc-4062-9c16-8621c7eed535-registry-certificates\") on node \"ip-10-0-138-9.ec2.internal\" DevicePath \"\"" Apr 20 17:51:29.588584 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:29.588589 2581 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/48b860ba-98dc-4062-9c16-8621c7eed535-installation-pull-secrets\") on node \"ip-10-0-138-9.ec2.internal\" DevicePath \"\"" Apr 20 17:51:29.588860 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:29.588601 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-58db5\" (UniqueName: \"kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-kube-api-access-58db5\") on node \"ip-10-0-138-9.ec2.internal\" DevicePath \"\"" Apr 20 17:51:29.588860 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:29.588615 2581 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-bound-sa-token\") on node \"ip-10-0-138-9.ec2.internal\" DevicePath \"\"" Apr 20 17:51:29.588860 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:29.588649 2581 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/48b860ba-98dc-4062-9c16-8621c7eed535-ca-trust-extracted\") on node \"ip-10-0-138-9.ec2.internal\" DevicePath \"\"" Apr 20 17:51:29.588860 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:29.588661 2581 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48b860ba-98dc-4062-9c16-8621c7eed535-trusted-ca\") on node \"ip-10-0-138-9.ec2.internal\" DevicePath \"\"" Apr 20 17:51:30.350845 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:30.350812 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5c47d9bcc6-8sbg6" Apr 20 17:51:30.384876 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:30.384842 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5c47d9bcc6-8sbg6"] Apr 20 17:51:30.388367 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:30.388343 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5c47d9bcc6-8sbg6"] Apr 20 17:51:30.496130 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:30.496094 2581 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/48b860ba-98dc-4062-9c16-8621c7eed535-registry-tls\") on node \"ip-10-0-138-9.ec2.internal\" DevicePath \"\"" Apr 20 17:51:31.764519 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:31.764485 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48b860ba-98dc-4062-9c16-8621c7eed535" path="/var/lib/kubelet/pods/48b860ba-98dc-4062-9c16-8621c7eed535/volumes" Apr 20 17:51:35.537467 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:35.537431 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" podUID="1d73585e-0790-49af-85fe-84de1111a4e8" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 17:51:45.538135 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:45.538091 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" podUID="1d73585e-0790-49af-85fe-84de1111a4e8" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 17:51:45.538494 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:45.538175 2581 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" Apr 20 17:51:45.538720 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:45.538689 2581 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"8a142e848e672bd8d1cde25b45b34a83b65957c0fd3eaec07aad2f779661ce64"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 20 17:51:45.538793 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:45.538777 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" podUID="1d73585e-0790-49af-85fe-84de1111a4e8" containerName="service-proxy" containerID="cri-o://8a142e848e672bd8d1cde25b45b34a83b65957c0fd3eaec07aad2f779661ce64" gracePeriod=30 Apr 20 17:51:46.390959 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:46.390925 2581 generic.go:358] "Generic (PLEG): container finished" podID="1d73585e-0790-49af-85fe-84de1111a4e8" containerID="8a142e848e672bd8d1cde25b45b34a83b65957c0fd3eaec07aad2f779661ce64" exitCode=2 Apr 20 17:51:46.390959 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:46.390967 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" event={"ID":"1d73585e-0790-49af-85fe-84de1111a4e8","Type":"ContainerDied","Data":"8a142e848e672bd8d1cde25b45b34a83b65957c0fd3eaec07aad2f779661ce64"} Apr 20 17:51:46.391170 
ip-10-0-138-9 kubenswrapper[2581]: I0420 17:51:46.390994 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c6976b-b646d" event={"ID":"1d73585e-0790-49af-85fe-84de1111a4e8","Type":"ContainerStarted","Data":"6f88b3588cde88d3823e7cb93193e5e38abf8edc9c9e6c97508745873c0bd9b1"} Apr 20 17:52:13.628429 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:13.628332 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs\") pod \"network-metrics-daemon-7gff7\" (UID: \"c4102ca4-2dfd-487f-85a4-c91b3ae6797e\") " pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:52:13.630636 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:13.630599 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c4102ca4-2dfd-487f-85a4-c91b3ae6797e-metrics-certs\") pod \"network-metrics-daemon-7gff7\" (UID: \"c4102ca4-2dfd-487f-85a4-c91b3ae6797e\") " pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:52:13.765812 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:13.765785 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ddm2v\"" Apr 20 17:52:13.773938 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:13.773920 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gff7" Apr 20 17:52:13.889013 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:13.888841 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7gff7"] Apr 20 17:52:13.891600 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:52:13.891573 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4102ca4_2dfd_487f_85a4_c91b3ae6797e.slice/crio-a38dce919326732b5aeb725b0ec6cd95e39e9b2bbd8b0594a09229bb902a9d13 WatchSource:0}: Error finding container a38dce919326732b5aeb725b0ec6cd95e39e9b2bbd8b0594a09229bb902a9d13: Status 404 returned error can't find the container with id a38dce919326732b5aeb725b0ec6cd95e39e9b2bbd8b0594a09229bb902a9d13 Apr 20 17:52:14.462597 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:14.462553 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7gff7" event={"ID":"c4102ca4-2dfd-487f-85a4-c91b3ae6797e","Type":"ContainerStarted","Data":"a38dce919326732b5aeb725b0ec6cd95e39e9b2bbd8b0594a09229bb902a9d13"} Apr 20 17:52:15.469635 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:15.469584 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7gff7" event={"ID":"c4102ca4-2dfd-487f-85a4-c91b3ae6797e","Type":"ContainerStarted","Data":"039a953a0bf251664e7f0ca6bb67704278373e8a21792eef7b281009e8faefec"} Apr 20 17:52:15.469635 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:15.469638 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7gff7" event={"ID":"c4102ca4-2dfd-487f-85a4-c91b3ae6797e","Type":"ContainerStarted","Data":"6947505e1147f1b8d64ea8598c7c31894b3ed88206fbcf7e05fc8cffa57f2dda"} Apr 20 17:52:15.488539 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:15.488494 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7gff7" 
podStartSLOduration=253.553456388 podStartE2EDuration="4m14.488480141s" podCreationTimestamp="2026-04-20 17:48:01 +0000 UTC" firstStartedPulling="2026-04-20 17:52:13.895573275 +0000 UTC m=+252.671104055" lastFinishedPulling="2026-04-20 17:52:14.830597034 +0000 UTC m=+253.606127808" observedRunningTime="2026-04-20 17:52:15.486997084 +0000 UTC m=+254.262527878" watchObservedRunningTime="2026-04-20 17:52:15.488480141 +0000 UTC m=+254.264010935" Apr 20 17:52:41.216040 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:52:41.215974 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-z92d2" podUID="f82115f5-7c80-4334-9e2c-bf493509b8ca" Apr 20 17:52:41.540285 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:41.540203 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-z92d2" Apr 20 17:52:42.218416 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:52:42.218373 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-tqj4h" podUID="baa515c8-8724-40c1-a30b-d562783453d5" Apr 20 17:52:42.218416 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:52:42.218392 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-p6j2d" podUID="9ada05b5-e0d5-449c-83d5-41ed76dac3ee" Apr 20 17:52:42.542478 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:42.542394 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p6j2d" Apr 20 17:52:42.542647 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:42.542394 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tqj4h" Apr 20 17:52:45.272445 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:45.272396 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-z92d2\" (UID: \"f82115f5-7c80-4334-9e2c-bf493509b8ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-z92d2" Apr 20 17:52:45.274942 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:45.274912 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f82115f5-7c80-4334-9e2c-bf493509b8ca-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-z92d2\" (UID: \"f82115f5-7c80-4334-9e2c-bf493509b8ca\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-z92d2" Apr 20 17:52:45.373595 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:45.373557 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls\") pod \"dns-default-p6j2d\" (UID: \"9ada05b5-e0d5-449c-83d5-41ed76dac3ee\") " pod="openshift-dns/dns-default-p6j2d" Apr 20 17:52:45.373595 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:45.373604 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert\") pod \"ingress-canary-tqj4h\" (UID: \"baa515c8-8724-40c1-a30b-d562783453d5\") " pod="openshift-ingress-canary/ingress-canary-tqj4h" Apr 20 17:52:45.375995 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:45.375970 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ada05b5-e0d5-449c-83d5-41ed76dac3ee-metrics-tls\") pod \"dns-default-p6j2d\" (UID: \"9ada05b5-e0d5-449c-83d5-41ed76dac3ee\") " pod="openshift-dns/dns-default-p6j2d" Apr 20 17:52:45.376134 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:45.376117 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/baa515c8-8724-40c1-a30b-d562783453d5-cert\") pod \"ingress-canary-tqj4h\" (UID: \"baa515c8-8724-40c1-a30b-d562783453d5\") " pod="openshift-ingress-canary/ingress-canary-tqj4h" Apr 20 17:52:45.443979 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:45.443942 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-8tm8c\"" Apr 20 17:52:45.451249 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:45.451212 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-z92d2" Apr 20 17:52:45.546391 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:45.546352 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gkt2t\"" Apr 20 17:52:45.546391 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:45.546390 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-82cw8\"" Apr 20 17:52:45.554100 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:45.554074 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tqj4h" Apr 20 17:52:45.554100 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:45.554099 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p6j2d" Apr 20 17:52:45.569134 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:45.569106 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-z92d2"] Apr 20 17:52:45.572803 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:52:45.572768 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf82115f5_7c80_4334_9e2c_bf493509b8ca.slice/crio-d3c339a4804fbe17b32c9d3ab91b80289bddb55397b81d81c9a93da44d7eb201 WatchSource:0}: Error finding container d3c339a4804fbe17b32c9d3ab91b80289bddb55397b81d81c9a93da44d7eb201: Status 404 returned error can't find the container with id d3c339a4804fbe17b32c9d3ab91b80289bddb55397b81d81c9a93da44d7eb201 Apr 20 17:52:45.682717 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:45.681161 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tqj4h"] Apr 20 17:52:45.686404 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:52:45.686377 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaa515c8_8724_40c1_a30b_d562783453d5.slice/crio-bef01cc5fa60bf214ceb7942132b03064690760ce2bd2ea0380b407540609d5c WatchSource:0}: Error finding container bef01cc5fa60bf214ceb7942132b03064690760ce2bd2ea0380b407540609d5c: Status 404 returned error can't find the container with id bef01cc5fa60bf214ceb7942132b03064690760ce2bd2ea0380b407540609d5c Apr 20 17:52:45.701199 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:45.701161 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p6j2d"] Apr 20 17:52:45.704149 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:52:45.704116 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ada05b5_e0d5_449c_83d5_41ed76dac3ee.slice/crio-1c21b76076ccd2e870da566ad12c8ac46802f6bf0d708b6dd0d262d0e2b7686a WatchSource:0}: Error finding container 1c21b76076ccd2e870da566ad12c8ac46802f6bf0d708b6dd0d262d0e2b7686a: Status 404 returned error can't find the container with id 1c21b76076ccd2e870da566ad12c8ac46802f6bf0d708b6dd0d262d0e2b7686a Apr 20 17:52:46.554491 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:46.554423 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tqj4h" event={"ID":"baa515c8-8724-40c1-a30b-d562783453d5","Type":"ContainerStarted","Data":"bef01cc5fa60bf214ceb7942132b03064690760ce2bd2ea0380b407540609d5c"} Apr 20 17:52:46.556130 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:46.556097 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p6j2d" event={"ID":"9ada05b5-e0d5-449c-83d5-41ed76dac3ee","Type":"ContainerStarted","Data":"1c21b76076ccd2e870da566ad12c8ac46802f6bf0d708b6dd0d262d0e2b7686a"} Apr 20 17:52:46.557246 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:46.557208 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-z92d2" event={"ID":"f82115f5-7c80-4334-9e2c-bf493509b8ca","Type":"ContainerStarted","Data":"d3c339a4804fbe17b32c9d3ab91b80289bddb55397b81d81c9a93da44d7eb201"} Apr 20 
17:52:47.561736 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:47.561683 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-z92d2" event={"ID":"f82115f5-7c80-4334-9e2c-bf493509b8ca","Type":"ContainerStarted","Data":"daa5ff379d32e1bd9cbe3956452026f6de453812a48ac479c4bba1fafe1f4b96"} Apr 20 17:52:47.588559 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:47.588484 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-z92d2" podStartSLOduration=281.468647643 podStartE2EDuration="4m42.588461017s" podCreationTimestamp="2026-04-20 17:48:05 +0000 UTC" firstStartedPulling="2026-04-20 17:52:45.574689331 +0000 UTC m=+284.350220105" lastFinishedPulling="2026-04-20 17:52:46.694502699 +0000 UTC m=+285.470033479" observedRunningTime="2026-04-20 17:52:47.588225588 +0000 UTC m=+286.363756384" watchObservedRunningTime="2026-04-20 17:52:47.588461017 +0000 UTC m=+286.363991812" Apr 20 17:52:48.568096 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:48.568057 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tqj4h" event={"ID":"baa515c8-8724-40c1-a30b-d562783453d5","Type":"ContainerStarted","Data":"a50d9ea9842d63b68e23c8ab2714c0b5b08344c41510cdda132171d50decbfd0"} Apr 20 17:52:48.569547 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:48.569522 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p6j2d" event={"ID":"9ada05b5-e0d5-449c-83d5-41ed76dac3ee","Type":"ContainerStarted","Data":"722ee9b71c1b580c4a51a3ea04656c0dc3c094b50bacdd9c315000256fe1bee0"} Apr 20 17:52:48.569547 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:48.569549 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p6j2d" event={"ID":"9ada05b5-e0d5-449c-83d5-41ed76dac3ee","Type":"ContainerStarted","Data":"f0064bba7976aed7c330b80a6d19042727d559defe2652c5765ad0465afcc04c"} Apr 20 17:52:48.583647 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:48.583580 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-tqj4h" podStartSLOduration=251.584575375 podStartE2EDuration="4m13.583564512s" podCreationTimestamp="2026-04-20 17:48:35 +0000 UTC" firstStartedPulling="2026-04-20 17:52:45.688740069 +0000 UTC m=+284.464270843" lastFinishedPulling="2026-04-20 17:52:47.687729206 +0000 UTC m=+286.463259980" observedRunningTime="2026-04-20 17:52:48.583049309 +0000 UTC m=+287.358580106" watchObservedRunningTime="2026-04-20 17:52:48.583564512 +0000 UTC m=+287.359095308" Apr 20 17:52:48.600004 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:48.599946 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-p6j2d" podStartSLOduration=251.622507085 podStartE2EDuration="4m13.599926782s" podCreationTimestamp="2026-04-20 17:48:35 +0000 UTC" firstStartedPulling="2026-04-20 17:52:45.705867862 +0000 UTC m=+284.481398635" lastFinishedPulling="2026-04-20 17:52:47.683287556 +0000 UTC m=+286.458818332" observedRunningTime="2026-04-20 17:52:48.599276766 +0000 UTC m=+287.374807561" watchObservedRunningTime="2026-04-20 17:52:48.599926782 +0000 UTC m=+287.375457578" Apr 20 17:52:49.573386 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:52:49.573349 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-p6j2d" Apr 20 17:52:59.577726 ip-10-0-138-9 
kubenswrapper[2581]: I0420 17:52:59.577694 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-p6j2d"
Apr 20 17:53:01.653514 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:53:01.653488 2581 kubelet.go:1628] "Image garbage collection succeeded"
Apr 20 17:55:42.035600 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:55:42.035564 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-fwcp5"]
Apr 20 17:55:42.038608 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:55:42.038591 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fwcp5"
Apr 20 17:55:42.041578 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:55:42.041549 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 20 17:55:42.041718 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:55:42.041578 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-588z5\""
Apr 20 17:55:42.041718 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:55:42.041551 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 20 17:55:42.046770 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:55:42.046746 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-fwcp5"]
Apr 20 17:55:42.111255 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:55:42.111216 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97knx\" (UniqueName: \"kubernetes.io/projected/39f7cf9c-c077-46d9-b01b-ef43218f5211-kube-api-access-97knx\") pod \"openshift-lws-operator-bfc7f696d-fwcp5\" (UID: \"39f7cf9c-c077-46d9-b01b-ef43218f5211\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fwcp5"
Apr 20 17:55:42.111438 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:55:42.111272 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/39f7cf9c-c077-46d9-b01b-ef43218f5211-tmp\") pod \"openshift-lws-operator-bfc7f696d-fwcp5\" (UID: \"39f7cf9c-c077-46d9-b01b-ef43218f5211\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fwcp5"
Apr 20 17:55:42.211662 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:55:42.211605 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/39f7cf9c-c077-46d9-b01b-ef43218f5211-tmp\") pod \"openshift-lws-operator-bfc7f696d-fwcp5\" (UID: \"39f7cf9c-c077-46d9-b01b-ef43218f5211\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fwcp5"
Apr 20 17:55:42.211816 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:55:42.211683 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97knx\" (UniqueName: \"kubernetes.io/projected/39f7cf9c-c077-46d9-b01b-ef43218f5211-kube-api-access-97knx\") pod \"openshift-lws-operator-bfc7f696d-fwcp5\" (UID: \"39f7cf9c-c077-46d9-b01b-ef43218f5211\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fwcp5"
Apr 20 17:55:42.212010 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:55:42.211991 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/39f7cf9c-c077-46d9-b01b-ef43218f5211-tmp\") pod \"openshift-lws-operator-bfc7f696d-fwcp5\" (UID: \"39f7cf9c-c077-46d9-b01b-ef43218f5211\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fwcp5"
Apr 20 17:55:42.219872 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:55:42.219842 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97knx\" (UniqueName: \"kubernetes.io/projected/39f7cf9c-c077-46d9-b01b-ef43218f5211-kube-api-access-97knx\") pod \"openshift-lws-operator-bfc7f696d-fwcp5\" (UID: \"39f7cf9c-c077-46d9-b01b-ef43218f5211\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fwcp5"
Apr 20 17:55:42.348447 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:55:42.348355 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fwcp5"
Apr 20 17:55:42.471044 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:55:42.471012 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-fwcp5"]
Apr 20 17:55:42.474321 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:55:42.474285 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39f7cf9c_c077_46d9_b01b_ef43218f5211.slice/crio-25db7dab04d6d2d159f40437f2d226d71880d928b98d69dc544003a16dffa4a6 WatchSource:0}: Error finding container 25db7dab04d6d2d159f40437f2d226d71880d928b98d69dc544003a16dffa4a6: Status 404 returned error can't find the container with id 25db7dab04d6d2d159f40437f2d226d71880d928b98d69dc544003a16dffa4a6
Apr 20 17:55:42.475798 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:55:42.475778 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 17:55:43.006406 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:55:43.006372 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fwcp5" event={"ID":"39f7cf9c-c077-46d9-b01b-ef43218f5211","Type":"ContainerStarted","Data":"25db7dab04d6d2d159f40437f2d226d71880d928b98d69dc544003a16dffa4a6"}
Apr 20 17:55:46.016877 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:55:46.016841 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fwcp5" event={"ID":"39f7cf9c-c077-46d9-b01b-ef43218f5211","Type":"ContainerStarted","Data":"1df0e220ffa8c05bf55c20422518452a1d3c3ac2a3351bfd183b67a65c052fae"}
Apr 20 17:55:46.035972 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:55:46.035903 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-fwcp5" podStartSLOduration=1.289211718 podStartE2EDuration="4.035883032s" podCreationTimestamp="2026-04-20 17:55:42 +0000 UTC" firstStartedPulling="2026-04-20 17:55:42.475940284 +0000 UTC m=+461.251471058" lastFinishedPulling="2026-04-20 17:55:45.222611599 +0000 UTC m=+463.998142372" observedRunningTime="2026-04-20 17:55:46.032493393 +0000 UTC m=+464.808024195" watchObservedRunningTime="2026-04-20 17:55:46.035883032 +0000 UTC m=+464.811413828"
Apr 20 17:56:01.792202 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:01.792163 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-b8c4c7886-4f5ss"]
Apr 20 17:56:01.798476 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:01.798453 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-4f5ss"
Apr 20 17:56:01.801345 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:01.801319 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 20 17:56:01.801527 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:01.801322 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 20 17:56:01.801527 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:01.801430 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 20 17:56:01.801527 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:01.801472 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-sc5qn\""
Apr 20 17:56:01.802560 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:01.802541 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 20 17:56:01.809728 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:01.809705 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-b8c4c7886-4f5ss"]
Apr 20 17:56:01.854451 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:01.854414 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/883ebf5e-88bc-4638-8ef1-8d3bf51e49ad-webhook-cert\") pod \"opendatahub-operator-controller-manager-b8c4c7886-4f5ss\" (UID: \"883ebf5e-88bc-4638-8ef1-8d3bf51e49ad\") " pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-4f5ss"
Apr 20 17:56:01.854634 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:01.854461 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl5nw\" (UniqueName: \"kubernetes.io/projected/883ebf5e-88bc-4638-8ef1-8d3bf51e49ad-kube-api-access-gl5nw\") pod \"opendatahub-operator-controller-manager-b8c4c7886-4f5ss\" (UID: \"883ebf5e-88bc-4638-8ef1-8d3bf51e49ad\") " pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-4f5ss"
Apr 20 17:56:01.854634 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:01.854520 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/883ebf5e-88bc-4638-8ef1-8d3bf51e49ad-apiservice-cert\") pod \"opendatahub-operator-controller-manager-b8c4c7886-4f5ss\" (UID: \"883ebf5e-88bc-4638-8ef1-8d3bf51e49ad\") " pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-4f5ss"
Apr 20 17:56:01.954842 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:01.954793 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/883ebf5e-88bc-4638-8ef1-8d3bf51e49ad-apiservice-cert\") pod \"opendatahub-operator-controller-manager-b8c4c7886-4f5ss\" (UID: \"883ebf5e-88bc-4638-8ef1-8d3bf51e49ad\") " pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-4f5ss"
Apr 20 17:56:01.954842 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:01.954840 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/883ebf5e-88bc-4638-8ef1-8d3bf51e49ad-webhook-cert\") pod \"opendatahub-operator-controller-manager-b8c4c7886-4f5ss\" (UID: \"883ebf5e-88bc-4638-8ef1-8d3bf51e49ad\") " pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-4f5ss"
Apr 20 17:56:01.955109 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:01.954873 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gl5nw\" (UniqueName: \"kubernetes.io/projected/883ebf5e-88bc-4638-8ef1-8d3bf51e49ad-kube-api-access-gl5nw\") pod \"opendatahub-operator-controller-manager-b8c4c7886-4f5ss\" (UID: \"883ebf5e-88bc-4638-8ef1-8d3bf51e49ad\") " pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-4f5ss"
Apr 20 17:56:01.957402 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:01.957360 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/883ebf5e-88bc-4638-8ef1-8d3bf51e49ad-webhook-cert\") pod \"opendatahub-operator-controller-manager-b8c4c7886-4f5ss\" (UID: \"883ebf5e-88bc-4638-8ef1-8d3bf51e49ad\") " pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-4f5ss"
Apr 20 17:56:01.957508 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:01.957487 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/883ebf5e-88bc-4638-8ef1-8d3bf51e49ad-apiservice-cert\") pod \"opendatahub-operator-controller-manager-b8c4c7886-4f5ss\" (UID: \"883ebf5e-88bc-4638-8ef1-8d3bf51e49ad\") " pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-4f5ss"
Apr 20 17:56:01.963876 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:01.963852 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl5nw\" (UniqueName: \"kubernetes.io/projected/883ebf5e-88bc-4638-8ef1-8d3bf51e49ad-kube-api-access-gl5nw\") pod \"opendatahub-operator-controller-manager-b8c4c7886-4f5ss\" (UID: \"883ebf5e-88bc-4638-8ef1-8d3bf51e49ad\") " pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-4f5ss"
Apr 20 17:56:02.107914 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:02.107810 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-4f5ss"
Apr 20 17:56:02.231553 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:02.231518 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-b8c4c7886-4f5ss"]
Apr 20 17:56:02.234977 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:56:02.234949 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod883ebf5e_88bc_4638_8ef1_8d3bf51e49ad.slice/crio-2e849a8ca10cbcffcce078ff8a986cf4e13f9c8307a6dfe0b1f0259e25f02bfc WatchSource:0}: Error finding container 2e849a8ca10cbcffcce078ff8a986cf4e13f9c8307a6dfe0b1f0259e25f02bfc: Status 404 returned error can't find the container with id 2e849a8ca10cbcffcce078ff8a986cf4e13f9c8307a6dfe0b1f0259e25f02bfc
Apr 20 17:56:03.060560 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:03.060511 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-4f5ss" event={"ID":"883ebf5e-88bc-4638-8ef1-8d3bf51e49ad","Type":"ContainerStarted","Data":"2e849a8ca10cbcffcce078ff8a986cf4e13f9c8307a6dfe0b1f0259e25f02bfc"}
Apr 20 17:56:05.068066 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:05.068033 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-4f5ss" event={"ID":"883ebf5e-88bc-4638-8ef1-8d3bf51e49ad","Type":"ContainerStarted","Data":"fbf5e585f7efeb00eac512ca00c35e0088ae49ea94d160066904e6654876bc9d"}
Apr 20 17:56:05.068442 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:05.068167 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-4f5ss"
Apr 20 17:56:05.092041 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:05.091987 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-4f5ss" podStartSLOduration=1.362970655 podStartE2EDuration="4.091969616s" podCreationTimestamp="2026-04-20 17:56:01 +0000 UTC" firstStartedPulling="2026-04-20 17:56:02.236731712 +0000 UTC m=+481.012262484" lastFinishedPulling="2026-04-20 17:56:04.965730661 +0000 UTC m=+483.741261445" observedRunningTime="2026-04-20 17:56:05.090932019 +0000 UTC m=+483.866462838" watchObservedRunningTime="2026-04-20 17:56:05.091969616 +0000 UTC m=+483.867500411"
Apr 20 17:56:16.073243 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:16.073212 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-4f5ss"
Apr 20 17:56:18.999513 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:18.999474 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-554dd5dd7d-xg5d4"]
Apr 20 17:56:19.002596 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:19.002577 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-554dd5dd7d-xg5d4"
Apr 20 17:56:19.005333 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:19.005310 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 20 17:56:19.007096 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:19.006935 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 20 17:56:19.007096 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:19.007067 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 20 17:56:19.007290 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:19.007066 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 20 17:56:19.008058 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:19.007890 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-mfcz2\""
Apr 20 17:56:19.012422 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:19.012396 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-554dd5dd7d-xg5d4"]
Apr 20 17:56:19.081999 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:19.081953 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d4cf93b3-09a3-4d10-997c-5b3c897b6be6-tls-certs\") pod \"kube-auth-proxy-554dd5dd7d-xg5d4\" (UID: \"d4cf93b3-09a3-4d10-997c-5b3c897b6be6\") " pod="openshift-ingress/kube-auth-proxy-554dd5dd7d-xg5d4"
Apr 20 17:56:19.081999 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:19.082008 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d4cf93b3-09a3-4d10-997c-5b3c897b6be6-tmp\") pod \"kube-auth-proxy-554dd5dd7d-xg5d4\" (UID: \"d4cf93b3-09a3-4d10-997c-5b3c897b6be6\") " pod="openshift-ingress/kube-auth-proxy-554dd5dd7d-xg5d4"
Apr 20 17:56:19.082217 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:19.082063 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbt84\" (UniqueName: \"kubernetes.io/projected/d4cf93b3-09a3-4d10-997c-5b3c897b6be6-kube-api-access-lbt84\") pod \"kube-auth-proxy-554dd5dd7d-xg5d4\" (UID: \"d4cf93b3-09a3-4d10-997c-5b3c897b6be6\") " pod="openshift-ingress/kube-auth-proxy-554dd5dd7d-xg5d4"
Apr 20 17:56:19.183110 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:19.183066 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d4cf93b3-09a3-4d10-997c-5b3c897b6be6-tls-certs\") pod \"kube-auth-proxy-554dd5dd7d-xg5d4\" (UID: \"d4cf93b3-09a3-4d10-997c-5b3c897b6be6\") " pod="openshift-ingress/kube-auth-proxy-554dd5dd7d-xg5d4"
Apr 20 17:56:19.183110 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:19.183116 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d4cf93b3-09a3-4d10-997c-5b3c897b6be6-tmp\") pod \"kube-auth-proxy-554dd5dd7d-xg5d4\" (UID: \"d4cf93b3-09a3-4d10-997c-5b3c897b6be6\") " pod="openshift-ingress/kube-auth-proxy-554dd5dd7d-xg5d4"
Apr 20 17:56:19.183322 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:19.183151 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbt84\" (UniqueName: \"kubernetes.io/projected/d4cf93b3-09a3-4d10-997c-5b3c897b6be6-kube-api-access-lbt84\") pod \"kube-auth-proxy-554dd5dd7d-xg5d4\" (UID: \"d4cf93b3-09a3-4d10-997c-5b3c897b6be6\") " pod="openshift-ingress/kube-auth-proxy-554dd5dd7d-xg5d4"
Apr 20 17:56:19.185511 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:19.185484 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d4cf93b3-09a3-4d10-997c-5b3c897b6be6-tmp\") pod \"kube-auth-proxy-554dd5dd7d-xg5d4\" (UID: \"d4cf93b3-09a3-4d10-997c-5b3c897b6be6\") " pod="openshift-ingress/kube-auth-proxy-554dd5dd7d-xg5d4"
Apr 20 17:56:19.185664 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:19.185639 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d4cf93b3-09a3-4d10-997c-5b3c897b6be6-tls-certs\") pod \"kube-auth-proxy-554dd5dd7d-xg5d4\" (UID: \"d4cf93b3-09a3-4d10-997c-5b3c897b6be6\") " pod="openshift-ingress/kube-auth-proxy-554dd5dd7d-xg5d4"
Apr 20 17:56:19.192042 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:19.192013 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbt84\" (UniqueName: \"kubernetes.io/projected/d4cf93b3-09a3-4d10-997c-5b3c897b6be6-kube-api-access-lbt84\") pod \"kube-auth-proxy-554dd5dd7d-xg5d4\" (UID: \"d4cf93b3-09a3-4d10-997c-5b3c897b6be6\") " pod="openshift-ingress/kube-auth-proxy-554dd5dd7d-xg5d4"
Apr 20 17:56:19.316823 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:19.316735 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-554dd5dd7d-xg5d4"
Apr 20 17:56:19.445203 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:19.445168 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-554dd5dd7d-xg5d4"]
Apr 20 17:56:19.448120 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:56:19.448089 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4cf93b3_09a3_4d10_997c_5b3c897b6be6.slice/crio-a126c63b97974f7e0057c4dbc147fa40663fed1847fa7a00cc861c61a8a1ad41 WatchSource:0}: Error finding container a126c63b97974f7e0057c4dbc147fa40663fed1847fa7a00cc861c61a8a1ad41: Status 404 returned error can't find the container with id a126c63b97974f7e0057c4dbc147fa40663fed1847fa7a00cc861c61a8a1ad41
Apr 20 17:56:20.111186 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:20.111136 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-554dd5dd7d-xg5d4" event={"ID":"d4cf93b3-09a3-4d10-997c-5b3c897b6be6","Type":"ContainerStarted","Data":"a126c63b97974f7e0057c4dbc147fa40663fed1847fa7a00cc861c61a8a1ad41"}
Apr 20 17:56:22.583544 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:22.583507 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-lnffx"]
Apr 20 17:56:22.587683 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:22.587664 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-lnffx"
Apr 20 17:56:22.593762 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:22.590487 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\""
Apr 20 17:56:22.593762 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:22.590611 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-m68qx\""
Apr 20 17:56:22.598245 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:22.598219 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-lnffx"]
Apr 20 17:56:22.712984 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:22.712935 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmqlb\" (UniqueName: \"kubernetes.io/projected/5c45cee7-7c81-4c6f-8550-4943f0ec1b33-kube-api-access-mmqlb\") pod \"odh-model-controller-858dbf95b8-lnffx\" (UID: \"5c45cee7-7c81-4c6f-8550-4943f0ec1b33\") " pod="opendatahub/odh-model-controller-858dbf95b8-lnffx"
Apr 20 17:56:22.713175 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:22.713070 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c45cee7-7c81-4c6f-8550-4943f0ec1b33-cert\") pod \"odh-model-controller-858dbf95b8-lnffx\" (UID: \"5c45cee7-7c81-4c6f-8550-4943f0ec1b33\") " pod="opendatahub/odh-model-controller-858dbf95b8-lnffx"
Apr 20 17:56:22.813750 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:22.813655 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c45cee7-7c81-4c6f-8550-4943f0ec1b33-cert\") pod \"odh-model-controller-858dbf95b8-lnffx\" (UID: \"5c45cee7-7c81-4c6f-8550-4943f0ec1b33\") " pod="opendatahub/odh-model-controller-858dbf95b8-lnffx"
Apr 20 17:56:22.813750 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:22.813713 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmqlb\" (UniqueName: \"kubernetes.io/projected/5c45cee7-7c81-4c6f-8550-4943f0ec1b33-kube-api-access-mmqlb\") pod \"odh-model-controller-858dbf95b8-lnffx\" (UID: \"5c45cee7-7c81-4c6f-8550-4943f0ec1b33\") " pod="opendatahub/odh-model-controller-858dbf95b8-lnffx"
Apr 20 17:56:22.813966 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:56:22.813827 2581 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 20 17:56:22.813966 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:56:22.813905 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c45cee7-7c81-4c6f-8550-4943f0ec1b33-cert podName:5c45cee7-7c81-4c6f-8550-4943f0ec1b33 nodeName:}" failed. No retries permitted until 2026-04-20 17:56:23.313884629 +0000 UTC m=+502.089415416 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5c45cee7-7c81-4c6f-8550-4943f0ec1b33-cert") pod "odh-model-controller-858dbf95b8-lnffx" (UID: "5c45cee7-7c81-4c6f-8550-4943f0ec1b33") : secret "odh-model-controller-webhook-cert" not found
Apr 20 17:56:22.822576 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:22.822543 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmqlb\" (UniqueName: \"kubernetes.io/projected/5c45cee7-7c81-4c6f-8550-4943f0ec1b33-kube-api-access-mmqlb\") pod \"odh-model-controller-858dbf95b8-lnffx\" (UID: \"5c45cee7-7c81-4c6f-8550-4943f0ec1b33\") " pod="opendatahub/odh-model-controller-858dbf95b8-lnffx"
Apr 20 17:56:23.121751 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:23.121662 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-554dd5dd7d-xg5d4" event={"ID":"d4cf93b3-09a3-4d10-997c-5b3c897b6be6","Type":"ContainerStarted","Data":"7b18d1520e44b7e7bf123588c56330f559eba4c44f939a7419ae53dbfb4b1a19"}
Apr 20 17:56:23.141400 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:23.141346 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-554dd5dd7d-xg5d4" podStartSLOduration=2.032736605 podStartE2EDuration="5.141330046s" podCreationTimestamp="2026-04-20 17:56:18 +0000 UTC" firstStartedPulling="2026-04-20 17:56:19.449796779 +0000 UTC m=+498.225327554" lastFinishedPulling="2026-04-20 17:56:22.558390222 +0000 UTC m=+501.333920995" observedRunningTime="2026-04-20 17:56:23.139887546 +0000 UTC m=+501.915418343" watchObservedRunningTime="2026-04-20 17:56:23.141330046 +0000 UTC m=+501.916860840"
Apr 20 17:56:23.318787 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:23.318748 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c45cee7-7c81-4c6f-8550-4943f0ec1b33-cert\") pod \"odh-model-controller-858dbf95b8-lnffx\" (UID: \"5c45cee7-7c81-4c6f-8550-4943f0ec1b33\") " pod="opendatahub/odh-model-controller-858dbf95b8-lnffx"
Apr 20 17:56:23.321318 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:23.321291 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c45cee7-7c81-4c6f-8550-4943f0ec1b33-cert\") pod \"odh-model-controller-858dbf95b8-lnffx\" (UID: \"5c45cee7-7c81-4c6f-8550-4943f0ec1b33\") " pod="opendatahub/odh-model-controller-858dbf95b8-lnffx"
Apr 20 17:56:23.517849 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:23.517804 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-lnffx"
Apr 20 17:56:23.638504 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:23.638472 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-lnffx"]
Apr 20 17:56:23.641735 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:56:23.641697 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c45cee7_7c81_4c6f_8550_4943f0ec1b33.slice/crio-ae6b98850a0828d0363af150aa9b15def12039300f36cf6bf67af39e073c6f59 WatchSource:0}: Error finding container ae6b98850a0828d0363af150aa9b15def12039300f36cf6bf67af39e073c6f59: Status 404 returned error can't find the container with id ae6b98850a0828d0363af150aa9b15def12039300f36cf6bf67af39e073c6f59
Apr 20 17:56:24.124968 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:24.124932 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-lnffx" event={"ID":"5c45cee7-7c81-4c6f-8550-4943f0ec1b33","Type":"ContainerStarted","Data":"ae6b98850a0828d0363af150aa9b15def12039300f36cf6bf67af39e073c6f59"}
Apr 20 17:56:27.134157 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:27.134063 2581 generic.go:358] "Generic (PLEG): container finished" podID="5c45cee7-7c81-4c6f-8550-4943f0ec1b33" containerID="9f2aa7d483c0e51936ab794c19f4faadbac203d2314bcd2d933ba051ccc95705" exitCode=1
Apr 20 17:56:27.134157 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:27.134141 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-lnffx" event={"ID":"5c45cee7-7c81-4c6f-8550-4943f0ec1b33","Type":"ContainerDied","Data":"9f2aa7d483c0e51936ab794c19f4faadbac203d2314bcd2d933ba051ccc95705"}
Apr 20 17:56:27.134544 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:27.134393 2581 scope.go:117] "RemoveContainer" containerID="9f2aa7d483c0e51936ab794c19f4faadbac203d2314bcd2d933ba051ccc95705"
Apr 20 17:56:28.139070 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:28.139025 2581 generic.go:358] "Generic (PLEG): container finished" podID="5c45cee7-7c81-4c6f-8550-4943f0ec1b33" containerID="309bc16e4e26b0a6d54683503640d9d17144a1167e6c869147f57afd90d2e07b" exitCode=1
Apr 20 17:56:28.139568 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:28.139108 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-lnffx" event={"ID":"5c45cee7-7c81-4c6f-8550-4943f0ec1b33","Type":"ContainerDied","Data":"309bc16e4e26b0a6d54683503640d9d17144a1167e6c869147f57afd90d2e07b"}
Apr 20 17:56:28.139568 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:28.139159 2581 scope.go:117] "RemoveContainer" containerID="9f2aa7d483c0e51936ab794c19f4faadbac203d2314bcd2d933ba051ccc95705"
Apr 20 17:56:28.139568 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:28.139372 2581 scope.go:117] "RemoveContainer" containerID="309bc16e4e26b0a6d54683503640d9d17144a1167e6c869147f57afd90d2e07b"
Apr 20 17:56:28.139742 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:56:28.139592 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-lnffx_opendatahub(5c45cee7-7c81-4c6f-8550-4943f0ec1b33)\"" pod="opendatahub/odh-model-controller-858dbf95b8-lnffx" podUID="5c45cee7-7c81-4c6f-8550-4943f0ec1b33"
Apr 20 17:56:28.529538 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:28.529505 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-vjnzh"]
Apr 20 17:56:28.533616 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:28.533598 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-vjnzh"
Apr 20 17:56:28.536027 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:28.535994 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-hmw2c\""
Apr 20 17:56:28.536173 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:28.536122 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\""
Apr 20 17:56:28.540922 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:28.540899 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-vjnzh"]
Apr 20 17:56:28.664156 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:28.664119 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f6e3122-c4bd-400d-9578-aaf92e911ccf-cert\") pod \"kserve-controller-manager-856948b99f-vjnzh\" (UID: \"3f6e3122-c4bd-400d-9578-aaf92e911ccf\") " pod="opendatahub/kserve-controller-manager-856948b99f-vjnzh"
Apr 20 17:56:28.664410 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:28.664176 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s488k\" (UniqueName: \"kubernetes.io/projected/3f6e3122-c4bd-400d-9578-aaf92e911ccf-kube-api-access-s488k\") pod \"kserve-controller-manager-856948b99f-vjnzh\" (UID: \"3f6e3122-c4bd-400d-9578-aaf92e911ccf\") " pod="opendatahub/kserve-controller-manager-856948b99f-vjnzh"
Apr 20 17:56:28.764673 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:28.764611 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f6e3122-c4bd-400d-9578-aaf92e911ccf-cert\") pod \"kserve-controller-manager-856948b99f-vjnzh\" (UID: \"3f6e3122-c4bd-400d-9578-aaf92e911ccf\") " pod="opendatahub/kserve-controller-manager-856948b99f-vjnzh"
Apr 20 17:56:28.764841 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:28.764683 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s488k\" (UniqueName: \"kubernetes.io/projected/3f6e3122-c4bd-400d-9578-aaf92e911ccf-kube-api-access-s488k\") pod \"kserve-controller-manager-856948b99f-vjnzh\" (UID: \"3f6e3122-c4bd-400d-9578-aaf92e911ccf\") " pod="opendatahub/kserve-controller-manager-856948b99f-vjnzh"
Apr 20 17:56:28.764841 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:56:28.764703 2581 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 20 17:56:28.764841 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:56:28.764756 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f6e3122-c4bd-400d-9578-aaf92e911ccf-cert podName:3f6e3122-c4bd-400d-9578-aaf92e911ccf nodeName:}" failed. No retries permitted until 2026-04-20 17:56:29.264740816 +0000 UTC m=+508.040271590 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3f6e3122-c4bd-400d-9578-aaf92e911ccf-cert") pod "kserve-controller-manager-856948b99f-vjnzh" (UID: "3f6e3122-c4bd-400d-9578-aaf92e911ccf") : secret "kserve-webhook-server-cert" not found
Apr 20 17:56:28.784136 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:28.784081 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s488k\" (UniqueName: \"kubernetes.io/projected/3f6e3122-c4bd-400d-9578-aaf92e911ccf-kube-api-access-s488k\") pod \"kserve-controller-manager-856948b99f-vjnzh\" (UID: \"3f6e3122-c4bd-400d-9578-aaf92e911ccf\") " pod="opendatahub/kserve-controller-manager-856948b99f-vjnzh"
Apr 20 17:56:29.143327 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:29.143243 2581 scope.go:117] "RemoveContainer" containerID="309bc16e4e26b0a6d54683503640d9d17144a1167e6c869147f57afd90d2e07b"
Apr 20 17:56:29.143702 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:56:29.143426 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-lnffx_opendatahub(5c45cee7-7c81-4c6f-8550-4943f0ec1b33)\"" pod="opendatahub/odh-model-controller-858dbf95b8-lnffx" podUID="5c45cee7-7c81-4c6f-8550-4943f0ec1b33"
Apr 20 17:56:29.269493 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:29.269451 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f6e3122-c4bd-400d-9578-aaf92e911ccf-cert\") pod \"kserve-controller-manager-856948b99f-vjnzh\" (UID: \"3f6e3122-c4bd-400d-9578-aaf92e911ccf\") " pod="opendatahub/kserve-controller-manager-856948b99f-vjnzh"
Apr 20 17:56:29.269666 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:56:29.269592 2581 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 20 17:56:29.269730 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:56:29.269678 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f6e3122-c4bd-400d-9578-aaf92e911ccf-cert podName:3f6e3122-c4bd-400d-9578-aaf92e911ccf nodeName:}" failed. No retries permitted until 2026-04-20 17:56:30.269662169 +0000 UTC m=+509.045192946 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3f6e3122-c4bd-400d-9578-aaf92e911ccf-cert") pod "kserve-controller-manager-856948b99f-vjnzh" (UID: "3f6e3122-c4bd-400d-9578-aaf92e911ccf") : secret "kserve-webhook-server-cert" not found
Apr 20 17:56:30.279974 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:30.279929 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f6e3122-c4bd-400d-9578-aaf92e911ccf-cert\") pod \"kserve-controller-manager-856948b99f-vjnzh\" (UID: \"3f6e3122-c4bd-400d-9578-aaf92e911ccf\") " pod="opendatahub/kserve-controller-manager-856948b99f-vjnzh"
Apr 20 17:56:30.282387 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:30.282367 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f6e3122-c4bd-400d-9578-aaf92e911ccf-cert\") pod \"kserve-controller-manager-856948b99f-vjnzh\" (UID: \"3f6e3122-c4bd-400d-9578-aaf92e911ccf\") " pod="opendatahub/kserve-controller-manager-856948b99f-vjnzh"
Apr 20 17:56:30.344360 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:30.344315 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-vjnzh"
Apr 20 17:56:30.492599 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:30.492562 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-vjnzh"]
Apr 20 17:56:30.495599 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:56:30.495571 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f6e3122_c4bd_400d_9578_aaf92e911ccf.slice/crio-1a207201b26d532184cb0d9855f0cc70f231127f562cc31649c39834d5daddc6 WatchSource:0}: Error finding container 1a207201b26d532184cb0d9855f0cc70f231127f562cc31649c39834d5daddc6: Status 404 returned error can't find the container with id 1a207201b26d532184cb0d9855f0cc70f231127f562cc31649c39834d5daddc6
Apr 20 17:56:31.150910 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:31.150869 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-vjnzh" event={"ID":"3f6e3122-c4bd-400d-9578-aaf92e911ccf","Type":"ContainerStarted","Data":"1a207201b26d532184cb0d9855f0cc70f231127f562cc31649c39834d5daddc6"}
Apr 20 17:56:33.518930 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:33.518833 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-lnffx"
Apr 20 17:56:33.519292 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:33.519220 2581 scope.go:117] "RemoveContainer" containerID="309bc16e4e26b0a6d54683503640d9d17144a1167e6c869147f57afd90d2e07b"
Apr 20 17:56:33.519392 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:56:33.519374 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-lnffx_opendatahub(5c45cee7-7c81-4c6f-8550-4943f0ec1b33)\"" pod="opendatahub/odh-model-controller-858dbf95b8-lnffx" podUID="5c45cee7-7c81-4c6f-8550-4943f0ec1b33"
Apr 20 17:56:34.116065 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:34.116030 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-ltqpm"]
Apr 20 17:56:34.119258 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:34.119238 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-ltqpm"
Apr 20 17:56:34.122716 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:34.122694 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-qrbmb\""
Apr 20 17:56:34.123033 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:34.123014 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 20 17:56:34.123178 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:34.123159 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 20 17:56:34.131240 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:34.131206 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-ltqpm"]
Apr 20 17:56:34.162791 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:34.162754 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-vjnzh" event={"ID":"3f6e3122-c4bd-400d-9578-aaf92e911ccf","Type":"ContainerStarted","Data":"8d2a87965afe0137bb0429bb67bec6da3c314f9582d583bd3c7e8aca1bec6781"}
Apr 20 17:56:34.162957 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:34.162933 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-vjnzh"
Apr 20 17:56:34.186192 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:34.186139 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-vjnzh" podStartSLOduration=3.505702178 podStartE2EDuration="6.186122065s" podCreationTimestamp="2026-04-20 17:56:28 +0000 UTC" firstStartedPulling="2026-04-20 17:56:30.496952606 +0000 UTC m=+509.272483381" lastFinishedPulling="2026-04-20 17:56:33.177372495 +0000 UTC m=+511.952903268" observedRunningTime="2026-04-20 17:56:34.185326819 +0000 UTC m=+512.960857616" watchObservedRunningTime="2026-04-20 17:56:34.186122065 +0000 UTC m=+512.961652888"
Apr 20 17:56:34.208851 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:34.208806 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/d0a0a039-a17c-4011-aebd-9d131231c1af-operator-config\") pod \"servicemesh-operator3-55f49c5f94-ltqpm\" (UID: \"d0a0a039-a17c-4011-aebd-9d131231c1af\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-ltqpm"
Apr 20 17:56:34.209016 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:34.208865 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wvfm\" (UniqueName: \"kubernetes.io/projected/d0a0a039-a17c-4011-aebd-9d131231c1af-kube-api-access-6wvfm\") pod \"servicemesh-operator3-55f49c5f94-ltqpm\" (UID: \"d0a0a039-a17c-4011-aebd-9d131231c1af\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-ltqpm"
Apr 20 17:56:34.309550 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:34.309511 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/d0a0a039-a17c-4011-aebd-9d131231c1af-operator-config\") pod \"servicemesh-operator3-55f49c5f94-ltqpm\" (UID: \"d0a0a039-a17c-4011-aebd-9d131231c1af\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-ltqpm"
Apr 20 17:56:34.309730 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:34.309561 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wvfm\" (UniqueName: \"kubernetes.io/projected/d0a0a039-a17c-4011-aebd-9d131231c1af-kube-api-access-6wvfm\") pod \"servicemesh-operator3-55f49c5f94-ltqpm\" (UID: \"d0a0a039-a17c-4011-aebd-9d131231c1af\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-ltqpm"
Apr 20 17:56:34.312213 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:34.312192 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/d0a0a039-a17c-4011-aebd-9d131231c1af-operator-config\") pod \"servicemesh-operator3-55f49c5f94-ltqpm\" (UID: \"d0a0a039-a17c-4011-aebd-9d131231c1af\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-ltqpm"
Apr 20 17:56:34.319810 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:34.319776 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wvfm\" (UniqueName: \"kubernetes.io/projected/d0a0a039-a17c-4011-aebd-9d131231c1af-kube-api-access-6wvfm\") pod \"servicemesh-operator3-55f49c5f94-ltqpm\" (UID: \"d0a0a039-a17c-4011-aebd-9d131231c1af\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-ltqpm"
Apr 20 17:56:34.428463 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:34.428420 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-ltqpm"
Apr 20 17:56:34.556555 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:34.556500 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-ltqpm"]
Apr 20 17:56:34.560789 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:56:34.560756 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0a0a039_a17c_4011_aebd_9d131231c1af.slice/crio-c6cffb46b9e0549dcb36e1460a9e457a781054e97b1976243754a91a8b77e701 WatchSource:0}: Error finding container c6cffb46b9e0549dcb36e1460a9e457a781054e97b1976243754a91a8b77e701: Status 404 returned error can't find the container with id c6cffb46b9e0549dcb36e1460a9e457a781054e97b1976243754a91a8b77e701
Apr 20 17:56:35.170638 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:35.170584 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-ltqpm" event={"ID":"d0a0a039-a17c-4011-aebd-9d131231c1af","Type":"ContainerStarted","Data":"c6cffb46b9e0549dcb36e1460a9e457a781054e97b1976243754a91a8b77e701"}
Apr 20 17:56:43.195948 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:43.195907 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-ltqpm" event={"ID":"d0a0a039-a17c-4011-aebd-9d131231c1af","Type":"ContainerStarted","Data":"7998a488ef4b4684577b4ee29d56263d192258a047947214499244d058d5747e"}
Apr 20 17:56:43.196352 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:43.196039 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-ltqpm"
Apr 20 17:56:43.217282 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:43.217225 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-ltqpm" podStartSLOduration=0.950799905 podStartE2EDuration="9.217205402s" podCreationTimestamp="2026-04-20 17:56:34 +0000 UTC" firstStartedPulling="2026-04-20 17:56:34.563292154 +0000 UTC m=+513.338822930" lastFinishedPulling="2026-04-20 17:56:42.829697655 +0000 UTC m=+521.605228427" observedRunningTime="2026-04-20 17:56:43.216709015 +0000 UTC m=+521.992239812" watchObservedRunningTime="2026-04-20 17:56:43.217205402 +0000 UTC m=+521.992736199"
Apr 20 17:56:43.518934 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:43.518841 2581 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-lnffx"
Apr 20 17:56:43.519386 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:43.519369 2581 scope.go:117] "RemoveContainer" containerID="309bc16e4e26b0a6d54683503640d9d17144a1167e6c869147f57afd90d2e07b"
Apr 20 17:56:43.872463 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:43.872431 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb"]
Apr 20 17:56:43.875601 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:43.875575 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb"
Apr 20 17:56:43.878640 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:43.878596 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 20 17:56:43.878996 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:43.878947 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 20 17:56:43.878996 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:43.878976 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 20 17:56:43.879165 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:43.879054 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 20 17:56:43.879291 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:43.879263 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-57qgt\""
Apr 20 17:56:43.902303 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:43.902273 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb"]
Apr 20 17:56:43.980599 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:43.980560 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/56f9588d-d0d6-4172-9969-5f3f8ebaba4b-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-76zmb\" (UID: \"56f9588d-d0d6-4172-9969-5f3f8ebaba4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb"
Apr 20 17:56:43.980599 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:43.980600 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/56f9588d-d0d6-4172-9969-5f3f8ebaba4b-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-76zmb\" (UID: \"56f9588d-d0d6-4172-9969-5f3f8ebaba4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb"
Apr 20 17:56:43.980843 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:43.980652 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/56f9588d-d0d6-4172-9969-5f3f8ebaba4b-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-76zmb\" (UID: \"56f9588d-d0d6-4172-9969-5f3f8ebaba4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb"
Apr 20 17:56:43.980843 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:43.980673 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6l7h\" (UniqueName: \"kubernetes.io/projected/56f9588d-d0d6-4172-9969-5f3f8ebaba4b-kube-api-access-r6l7h\") pod \"istiod-openshift-gateway-55ff986f96-76zmb\" (UID: \"56f9588d-d0d6-4172-9969-5f3f8ebaba4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb"
Apr 20 17:56:43.980843 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:43.980698 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/56f9588d-d0d6-4172-9969-5f3f8ebaba4b-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-76zmb\" (UID: \"56f9588d-d0d6-4172-9969-5f3f8ebaba4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb"
Apr 20 17:56:43.980843 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:43.980713 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/56f9588d-d0d6-4172-9969-5f3f8ebaba4b-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-76zmb\" (UID: \"56f9588d-d0d6-4172-9969-5f3f8ebaba4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb"
Apr 20 17:56:43.980843 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:43.980740 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/56f9588d-d0d6-4172-9969-5f3f8ebaba4b-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-76zmb\" (UID: \"56f9588d-d0d6-4172-9969-5f3f8ebaba4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb"
Apr 20 17:56:44.082294 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:44.082190 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/56f9588d-d0d6-4172-9969-5f3f8ebaba4b-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-76zmb\" (UID: \"56f9588d-d0d6-4172-9969-5f3f8ebaba4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb"
Apr 20 17:56:44.082294 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:44.082268 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/56f9588d-d0d6-4172-9969-5f3f8ebaba4b-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-76zmb\" (UID: \"56f9588d-d0d6-4172-9969-5f3f8ebaba4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb"
Apr 20 17:56:44.082563 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:44.082325 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/56f9588d-d0d6-4172-9969-5f3f8ebaba4b-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-76zmb\" (UID: \"56f9588d-d0d6-4172-9969-5f3f8ebaba4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb"
Apr 20 17:56:44.082563 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:44.082366 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/56f9588d-d0d6-4172-9969-5f3f8ebaba4b-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-76zmb\" (UID: \"56f9588d-d0d6-4172-9969-5f3f8ebaba4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb"
Apr 20 17:56:44.082563 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:44.082406 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/56f9588d-d0d6-4172-9969-5f3f8ebaba4b-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-76zmb\" (UID: \"56f9588d-d0d6-4172-9969-5f3f8ebaba4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb"
Apr 20 17:56:44.082563 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:44.082456 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/56f9588d-d0d6-4172-9969-5f3f8ebaba4b-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-76zmb\" (UID: \"56f9588d-d0d6-4172-9969-5f3f8ebaba4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb"
Apr 20 17:56:44.082563 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:44.082482 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r6l7h\" (UniqueName: \"kubernetes.io/projected/56f9588d-d0d6-4172-9969-5f3f8ebaba4b-kube-api-access-r6l7h\") pod \"istiod-openshift-gateway-55ff986f96-76zmb\" (UID: \"56f9588d-d0d6-4172-9969-5f3f8ebaba4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb"
Apr 20 17:56:44.083804 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:44.083779 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/56f9588d-d0d6-4172-9969-5f3f8ebaba4b-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-76zmb\" (UID: \"56f9588d-d0d6-4172-9969-5f3f8ebaba4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb"
Apr 20 17:56:44.085874 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:44.085843 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/56f9588d-d0d6-4172-9969-5f3f8ebaba4b-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-76zmb\" (UID: \"56f9588d-d0d6-4172-9969-5f3f8ebaba4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb"
Apr 20 17:56:44.086052 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:44.086022 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/56f9588d-d0d6-4172-9969-5f3f8ebaba4b-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-76zmb\" (UID: \"56f9588d-d0d6-4172-9969-5f3f8ebaba4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb"
Apr 20 17:56:44.086158 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:44.086066 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/56f9588d-d0d6-4172-9969-5f3f8ebaba4b-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-76zmb\" (UID: \"56f9588d-d0d6-4172-9969-5f3f8ebaba4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb"
Apr 20 17:56:44.086722 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:44.086694 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/56f9588d-d0d6-4172-9969-5f3f8ebaba4b-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-76zmb\" (UID: \"56f9588d-d0d6-4172-9969-5f3f8ebaba4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb"
Apr 20 17:56:44.091269 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:44.091239 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6l7h\" (UniqueName: \"kubernetes.io/projected/56f9588d-d0d6-4172-9969-5f3f8ebaba4b-kube-api-access-r6l7h\") pod \"istiod-openshift-gateway-55ff986f96-76zmb\" (UID: \"56f9588d-d0d6-4172-9969-5f3f8ebaba4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb"
Apr 20 17:56:44.092206 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:44.092182 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/56f9588d-d0d6-4172-9969-5f3f8ebaba4b-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-76zmb\" (UID: \"56f9588d-d0d6-4172-9969-5f3f8ebaba4b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb"
Apr 20 17:56:44.185186 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:44.185140 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb"
Apr 20 17:56:44.201863 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:44.201752 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-lnffx" event={"ID":"5c45cee7-7c81-4c6f-8550-4943f0ec1b33","Type":"ContainerStarted","Data":"f2291335cea3013b50a588001eeefa5b4fcaef302796b6c1a4142edd76c3900f"}
Apr 20 17:56:44.202494 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:44.202459 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-lnffx"
Apr 20 17:56:44.224171 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:44.224120 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-lnffx" podStartSLOduration=2.103185612 podStartE2EDuration="22.224102601s" podCreationTimestamp="2026-04-20 17:56:22 +0000 UTC" firstStartedPulling="2026-04-20 17:56:23.643456391 +0000 UTC m=+502.418987165" lastFinishedPulling="2026-04-20 17:56:43.764373367 +0000 UTC m=+522.539904154" observedRunningTime="2026-04-20 17:56:44.223549505 +0000 UTC m=+522.999080301" watchObservedRunningTime="2026-04-20 17:56:44.224102601 +0000 UTC m=+522.999633397"
Apr 20 17:56:44.332479 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:44.332394 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb"]
Apr 20 17:56:44.335451 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:56:44.335412 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56f9588d_d0d6_4172_9969_5f3f8ebaba4b.slice/crio-83c715ed010b9c417911f192d0230f60e9f5876e8f0be606e42b7045d3b6908e WatchSource:0}: Error finding container 83c715ed010b9c417911f192d0230f60e9f5876e8f0be606e42b7045d3b6908e: Status 404 returned error can't find the container with id 83c715ed010b9c417911f192d0230f60e9f5876e8f0be606e42b7045d3b6908e
Apr 20 17:56:45.207847 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:45.207737 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb" event={"ID":"56f9588d-d0d6-4172-9969-5f3f8ebaba4b","Type":"ContainerStarted","Data":"83c715ed010b9c417911f192d0230f60e9f5876e8f0be606e42b7045d3b6908e"}
Apr 20 17:56:47.085461 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:47.085406 2581 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 20 17:56:47.085751 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:47.085502 2581 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 20 17:56:47.216503 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:47.216465 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb" event={"ID":"56f9588d-d0d6-4172-9969-5f3f8ebaba4b","Type":"ContainerStarted","Data":"47fbb6c22d65134830a764fb51e1ff768794a031f7831558948a86ff41675772"}
Apr 20 17:56:47.216682 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:47.216613 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb"
Apr 20 17:56:47.240070 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:47.240014 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb" podStartSLOduration=1.492259013 podStartE2EDuration="4.239999825s" podCreationTimestamp="2026-04-20 17:56:43 +0000 UTC" firstStartedPulling="2026-04-20 17:56:44.337399455 +0000 UTC m=+523.112930233" lastFinishedPulling="2026-04-20 17:56:47.085140269 +0000 UTC m=+525.860671045" observedRunningTime="2026-04-20 17:56:47.238914844 +0000 UTC m=+526.014445639" watchObservedRunningTime="2026-04-20 17:56:47.239999825 +0000 UTC m=+526.015530619"
Apr 20 17:56:48.222409 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:48.222274 2581 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-76zmb container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=
Apr 20 17:56:48.222409 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:48.222346 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb" podUID="56f9588d-d0d6-4172-9969-5f3f8ebaba4b" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 17:56:51.221696 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:51.221661 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-76zmb"
Apr 20 17:56:54.205106 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:54.205078 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-ltqpm"
Apr 20 17:56:55.210610 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:56:55.210581 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-lnffx"
Apr 20 17:57:05.176117 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:57:05.176085 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-vjnzh"
Apr 20 17:58:00.076225 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:00.076194 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-wcpf7"]
Apr 20 17:58:00.079305 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:00.079284 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-wcpf7"
Apr 20 17:58:00.081848 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:00.081823 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 20 17:58:00.081981 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:00.081856 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 20 17:58:00.083071 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:00.083056 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-ljns4\""
Apr 20 17:58:00.083308 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:00.083288 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ggfd\" (UniqueName: \"kubernetes.io/projected/181ed309-e548-4308-9955-e76168b875c6-kube-api-access-5ggfd\") pod \"authorino-operator-657f44b778-wcpf7\" (UID: \"181ed309-e548-4308-9955-e76168b875c6\") " pod="kuadrant-system/authorino-operator-657f44b778-wcpf7"
Apr 20 17:58:00.088313 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:00.088279 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-wcpf7"]
Apr 20 17:58:00.183682 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:00.183641 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ggfd\" (UniqueName: \"kubernetes.io/projected/181ed309-e548-4308-9955-e76168b875c6-kube-api-access-5ggfd\") pod \"authorino-operator-657f44b778-wcpf7\" (UID: \"181ed309-e548-4308-9955-e76168b875c6\") " pod="kuadrant-system/authorino-operator-657f44b778-wcpf7"
Apr 20 17:58:00.193687 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:00.193658 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ggfd\" (UniqueName: \"kubernetes.io/projected/181ed309-e548-4308-9955-e76168b875c6-kube-api-access-5ggfd\") pod \"authorino-operator-657f44b778-wcpf7\" (UID: \"181ed309-e548-4308-9955-e76168b875c6\") " pod="kuadrant-system/authorino-operator-657f44b778-wcpf7"
Apr 20 17:58:00.390784 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:00.390675 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-wcpf7"
Apr 20 17:58:00.510650 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:00.510604 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-wcpf7"]
Apr 20 17:58:00.513311 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:58:00.513283 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod181ed309_e548_4308_9955_e76168b875c6.slice/crio-f50eaab33f255a7e2f1fb05f69ff2763bf8db37426e5ba57142c640452b4107f WatchSource:0}: Error finding container f50eaab33f255a7e2f1fb05f69ff2763bf8db37426e5ba57142c640452b4107f: Status 404 returned error can't find the container with id f50eaab33f255a7e2f1fb05f69ff2763bf8db37426e5ba57142c640452b4107f
Apr 20 17:58:01.456784 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:01.456745 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-wcpf7" event={"ID":"181ed309-e548-4308-9955-e76168b875c6","Type":"ContainerStarted","Data":"f50eaab33f255a7e2f1fb05f69ff2763bf8db37426e5ba57142c640452b4107f"}
Apr 20 17:58:02.461762 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:02.461658 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-wcpf7" event={"ID":"181ed309-e548-4308-9955-e76168b875c6","Type":"ContainerStarted","Data":"6b7a116e8b8f58072ddc422c13d1153b7566fe5eeb22d81f8f53c61864ea2e07"}
Apr 20 17:58:02.462137 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:02.461789 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-wcpf7"
Apr 20 17:58:02.479961 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:02.479892 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-wcpf7" podStartSLOduration=0.813983299 podStartE2EDuration="2.479872529s" podCreationTimestamp="2026-04-20 17:58:00 +0000 UTC" firstStartedPulling="2026-04-20 17:58:00.515238642 +0000 UTC m=+599.290769419" lastFinishedPulling="2026-04-20 17:58:02.181127876 +0000 UTC m=+600.956658649" observedRunningTime="2026-04-20 17:58:02.478848502 +0000 UTC m=+601.254379322" watchObservedRunningTime="2026-04-20 17:58:02.479872529 +0000 UTC m=+601.255403328"
Apr 20 17:58:13.467361 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:13.467285 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-wcpf7"
Apr 20 17:58:24.256635 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:24.256586 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-vg7gw"]
Apr 20 17:58:24.260019 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:24.260002 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-vg7gw" Apr 20 17:58:24.263189 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:24.263164 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-7bk76\"" Apr 20 17:58:24.272974 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:24.272950 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-vg7gw"] Apr 20 17:58:24.365157 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:24.365118 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrj24\" (UniqueName: \"kubernetes.io/projected/1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063-kube-api-access-mrj24\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-vg7gw\" (UID: \"1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-vg7gw" Apr 20 17:58:24.365337 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:24.365184 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-vg7gw\" (UID: \"1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-vg7gw" Apr 20 17:58:24.466017 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:24.465980 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-vg7gw\" (UID: \"1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-vg7gw" Apr 20 17:58:24.466152 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:24.466043 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrj24\" (UniqueName: \"kubernetes.io/projected/1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063-kube-api-access-mrj24\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-vg7gw\" (UID: \"1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-vg7gw" Apr 20 17:58:24.466385 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:24.466362 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-vg7gw\" (UID: \"1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-vg7gw" Apr 20 17:58:24.488749 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:24.488714 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrj24\" (UniqueName: \"kubernetes.io/projected/1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063-kube-api-access-mrj24\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-vg7gw\" (UID: \"1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-vg7gw" Apr 20 17:58:24.570398 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:24.570295 2581 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-vg7gw" Apr 20 17:58:24.696259 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:24.696222 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-vg7gw"] Apr 20 17:58:24.699796 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:58:24.699768 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dd66ce1_a5bb_4b89_b6d3_c0af7e78c063.slice/crio-a438822fdf7946948caa13ee99a64becec8729cf0bece5395d3d33e85639b472 WatchSource:0}: Error finding container a438822fdf7946948caa13ee99a64becec8729cf0bece5395d3d33e85639b472: Status 404 returned error can't find the container with id a438822fdf7946948caa13ee99a64becec8729cf0bece5395d3d33e85639b472 Apr 20 17:58:24.954647 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:24.954547 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-vg7gw"] Apr 20 17:58:24.965901 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:24.965823 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-vg7gw"] Apr 20 17:58:24.977830 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:24.977799 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6n65m"] Apr 20 17:58:24.982333 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:24.982309 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6n65m" Apr 20 17:58:24.991455 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:24.991375 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6n65m"] Apr 20 17:58:25.070494 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:25.070457 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f1c6d55e-86d3-4aa9-9623-28e5469ebe27-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-6n65m\" (UID: \"f1c6d55e-86d3-4aa9-9623-28e5469ebe27\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6n65m" Apr 20 17:58:25.070679 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:25.070518 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq6kt\" (UniqueName: \"kubernetes.io/projected/f1c6d55e-86d3-4aa9-9623-28e5469ebe27-kube-api-access-bq6kt\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-6n65m\" (UID: \"f1c6d55e-86d3-4aa9-9623-28e5469ebe27\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6n65m" Apr 20 17:58:25.171229 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:25.171187 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f1c6d55e-86d3-4aa9-9623-28e5469ebe27-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-6n65m\" (UID: \"f1c6d55e-86d3-4aa9-9623-28e5469ebe27\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6n65m" Apr 20 17:58:25.171429 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:25.171249 2581 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bq6kt\" (UniqueName: \"kubernetes.io/projected/f1c6d55e-86d3-4aa9-9623-28e5469ebe27-kube-api-access-bq6kt\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-6n65m\" (UID: \"f1c6d55e-86d3-4aa9-9623-28e5469ebe27\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6n65m" Apr 20 17:58:25.171698 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:25.171675 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f1c6d55e-86d3-4aa9-9623-28e5469ebe27-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-6n65m\" (UID: \"f1c6d55e-86d3-4aa9-9623-28e5469ebe27\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6n65m" Apr 20 17:58:25.180824 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:25.180801 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq6kt\" (UniqueName: \"kubernetes.io/projected/f1c6d55e-86d3-4aa9-9623-28e5469ebe27-kube-api-access-bq6kt\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-6n65m\" (UID: \"f1c6d55e-86d3-4aa9-9623-28e5469ebe27\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6n65m" Apr 20 17:58:25.297902 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:25.297818 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6n65m" Apr 20 17:58:25.426921 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:25.426885 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6n65m"] Apr 20 17:58:25.430376 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:58:25.430351 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1c6d55e_86d3_4aa9_9623_28e5469ebe27.slice/crio-bcbec38b6bb22e32bcafcd41ae2257d009c023d02a0c823d7cd882b24d825434 WatchSource:0}: Error finding container bcbec38b6bb22e32bcafcd41ae2257d009c023d02a0c823d7cd882b24d825434: Status 404 returned error can't find the container with id bcbec38b6bb22e32bcafcd41ae2257d009c023d02a0c823d7cd882b24d825434 Apr 20 17:58:25.534074 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:25.534033 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6n65m" event={"ID":"f1c6d55e-86d3-4aa9-9623-28e5469ebe27","Type":"ContainerStarted","Data":"bcbec38b6bb22e32bcafcd41ae2257d009c023d02a0c823d7cd882b24d825434"} Apr 20 17:58:29.550132 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:29.550093 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6n65m" event={"ID":"f1c6d55e-86d3-4aa9-9623-28e5469ebe27","Type":"ContainerStarted","Data":"ec42a7b7e2852c91a2e8ccd938cbca742efe33a756b0d5448a085618cd2a79f2"} Apr 20 17:58:29.550650 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:29.550180 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6n65m" Apr 20 17:58:29.551570 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:29.551547 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-vg7gw" 
podUID="1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063" containerName="manager" containerID="cri-o://9a66101d8a001ebdfe7c330fe3fdc6deffdaa5eaa9276e0c3eb6ddc09c6e1f29" gracePeriod=2 Apr 20 17:58:29.577477 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:29.577429 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6n65m" podStartSLOduration=1.951089564 podStartE2EDuration="5.57741013s" podCreationTimestamp="2026-04-20 17:58:24 +0000 UTC" firstStartedPulling="2026-04-20 17:58:25.432755727 +0000 UTC m=+624.208286500" lastFinishedPulling="2026-04-20 17:58:29.059076293 +0000 UTC m=+627.834607066" observedRunningTime="2026-04-20 17:58:29.569901381 +0000 UTC m=+628.345432188" watchObservedRunningTime="2026-04-20 17:58:29.57741013 +0000 UTC m=+628.352940926" Apr 20 17:58:29.787545 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:29.787522 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-vg7gw" Apr 20 17:58:29.920605 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:29.920577 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063-extensions-socket-volume\") pod \"1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063\" (UID: \"1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063\") " Apr 20 17:58:29.920825 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:29.920684 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrj24\" (UniqueName: \"kubernetes.io/projected/1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063-kube-api-access-mrj24\") pod \"1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063\" (UID: \"1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063\") " Apr 20 17:58:29.920902 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:29.920875 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063" (UID: "1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 17:58:29.922795 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:29.922766 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063-kube-api-access-mrj24" (OuterVolumeSpecName: "kube-api-access-mrj24") pod "1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063" (UID: "1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063"). InnerVolumeSpecName "kube-api-access-mrj24". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 17:58:30.021651 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:30.021584 2581 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063-extensions-socket-volume\") on node \"ip-10-0-138-9.ec2.internal\" DevicePath \"\"" Apr 20 17:58:30.021651 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:30.021641 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mrj24\" (UniqueName: \"kubernetes.io/projected/1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063-kube-api-access-mrj24\") on node \"ip-10-0-138-9.ec2.internal\" DevicePath \"\"" Apr 20 17:58:30.556696 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:30.556660 2581 generic.go:358] "Generic (PLEG): container finished" podID="1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063" containerID="9a66101d8a001ebdfe7c330fe3fdc6deffdaa5eaa9276e0c3eb6ddc09c6e1f29" exitCode=2 Apr 20 17:58:30.557074 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:30.556762 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-vg7gw" Apr 20 17:58:30.557074 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:30.556807 2581 scope.go:117] "RemoveContainer" containerID="9a66101d8a001ebdfe7c330fe3fdc6deffdaa5eaa9276e0c3eb6ddc09c6e1f29" Apr 20 17:58:30.559208 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:30.559175 2581 status_manager.go:895] "Failed to get status for pod" podUID="1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-vg7gw" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-vg7gw\" is forbidden: User \"system:node:ip-10-0-138-9.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-9.ec2.internal' and this object" Apr 20 17:58:30.565773 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:30.565749 2581 scope.go:117] "RemoveContainer" containerID="9a66101d8a001ebdfe7c330fe3fdc6deffdaa5eaa9276e0c3eb6ddc09c6e1f29" Apr 20 17:58:30.566054 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:58:30.566030 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a66101d8a001ebdfe7c330fe3fdc6deffdaa5eaa9276e0c3eb6ddc09c6e1f29\": container with ID starting with 9a66101d8a001ebdfe7c330fe3fdc6deffdaa5eaa9276e0c3eb6ddc09c6e1f29 not found: ID does not exist" containerID="9a66101d8a001ebdfe7c330fe3fdc6deffdaa5eaa9276e0c3eb6ddc09c6e1f29" Apr 20 17:58:30.566134 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:30.566061 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a66101d8a001ebdfe7c330fe3fdc6deffdaa5eaa9276e0c3eb6ddc09c6e1f29"} err="failed to get container status \"9a66101d8a001ebdfe7c330fe3fdc6deffdaa5eaa9276e0c3eb6ddc09c6e1f29\": rpc error: code = NotFound desc = could not find container \"9a66101d8a001ebdfe7c330fe3fdc6deffdaa5eaa9276e0c3eb6ddc09c6e1f29\": container with ID starting with 9a66101d8a001ebdfe7c330fe3fdc6deffdaa5eaa9276e0c3eb6ddc09c6e1f29 not found: ID does not exist" Apr 20 17:58:30.567758 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:30.567736 2581 status_manager.go:895] "Failed to get status for pod" podUID="1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-vg7gw" err="pods 
\"kuadrant-operator-controller-manager-6bc9f4c76f-vg7gw\" is forbidden: User \"system:node:ip-10-0-138-9.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-9.ec2.internal' and this object" Apr 20 17:58:31.765764 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:31.765715 2581 status_manager.go:895] "Failed to get status for pod" podUID="1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-vg7gw" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-vg7gw\" is forbidden: User \"system:node:ip-10-0-138-9.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-9.ec2.internal' and this object" Apr 20 17:58:31.766373 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:31.766351 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063" path="/var/lib/kubelet/pods/1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063/volumes" Apr 20 17:58:40.559588 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:40.559551 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6n65m" Apr 20 17:58:52.576987 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:52.576952 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6n65m"] Apr 20 17:58:52.577584 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:52.577269 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6n65m" podUID="f1c6d55e-86d3-4aa9-9623-28e5469ebe27" containerName="manager" containerID="cri-o://ec42a7b7e2852c91a2e8ccd938cbca742efe33a756b0d5448a085618cd2a79f2" gracePeriod=10 Apr 20 17:58:52.833544 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:52.833473 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6n65m" Apr 20 17:58:52.907007 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:52.906965 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq6kt\" (UniqueName: \"kubernetes.io/projected/f1c6d55e-86d3-4aa9-9623-28e5469ebe27-kube-api-access-bq6kt\") pod \"f1c6d55e-86d3-4aa9-9623-28e5469ebe27\" (UID: \"f1c6d55e-86d3-4aa9-9623-28e5469ebe27\") " Apr 20 17:58:52.907209 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:52.907026 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f1c6d55e-86d3-4aa9-9623-28e5469ebe27-extensions-socket-volume\") pod \"f1c6d55e-86d3-4aa9-9623-28e5469ebe27\" (UID: \"f1c6d55e-86d3-4aa9-9623-28e5469ebe27\") " Apr 20 17:58:52.907589 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:52.907558 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1c6d55e-86d3-4aa9-9623-28e5469ebe27-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "f1c6d55e-86d3-4aa9-9623-28e5469ebe27" (UID: "f1c6d55e-86d3-4aa9-9623-28e5469ebe27"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 17:58:52.909123 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:52.909100 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1c6d55e-86d3-4aa9-9623-28e5469ebe27-kube-api-access-bq6kt" (OuterVolumeSpecName: "kube-api-access-bq6kt") pod "f1c6d55e-86d3-4aa9-9623-28e5469ebe27" (UID: "f1c6d55e-86d3-4aa9-9623-28e5469ebe27"). InnerVolumeSpecName "kube-api-access-bq6kt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 17:58:53.007794 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:53.007756 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bq6kt\" (UniqueName: \"kubernetes.io/projected/f1c6d55e-86d3-4aa9-9623-28e5469ebe27-kube-api-access-bq6kt\") on node \"ip-10-0-138-9.ec2.internal\" DevicePath \"\"" Apr 20 17:58:53.007794 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:53.007786 2581 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f1c6d55e-86d3-4aa9-9623-28e5469ebe27-extensions-socket-volume\") on node \"ip-10-0-138-9.ec2.internal\" DevicePath \"\"" Apr 20 17:58:53.634522 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:53.634485 2581 generic.go:358] "Generic (PLEG): container finished" podID="f1c6d55e-86d3-4aa9-9623-28e5469ebe27" containerID="ec42a7b7e2852c91a2e8ccd938cbca742efe33a756b0d5448a085618cd2a79f2" exitCode=0 Apr 20 17:58:53.634971 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:53.634541 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6n65m" Apr 20 17:58:53.634971 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:53.634567 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6n65m" event={"ID":"f1c6d55e-86d3-4aa9-9623-28e5469ebe27","Type":"ContainerDied","Data":"ec42a7b7e2852c91a2e8ccd938cbca742efe33a756b0d5448a085618cd2a79f2"} Apr 20 17:58:53.634971 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:53.634601 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6n65m" event={"ID":"f1c6d55e-86d3-4aa9-9623-28e5469ebe27","Type":"ContainerDied","Data":"bcbec38b6bb22e32bcafcd41ae2257d009c023d02a0c823d7cd882b24d825434"} Apr 20 17:58:53.634971 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:53.634617 2581 scope.go:117] "RemoveContainer" containerID="ec42a7b7e2852c91a2e8ccd938cbca742efe33a756b0d5448a085618cd2a79f2" Apr 20 17:58:53.642615 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:53.642595 2581 scope.go:117] "RemoveContainer" containerID="ec42a7b7e2852c91a2e8ccd938cbca742efe33a756b0d5448a085618cd2a79f2" Apr 20 17:58:53.642929 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:58:53.642904 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec42a7b7e2852c91a2e8ccd938cbca742efe33a756b0d5448a085618cd2a79f2\": container with ID starting with ec42a7b7e2852c91a2e8ccd938cbca742efe33a756b0d5448a085618cd2a79f2 not found: ID does not exist" containerID="ec42a7b7e2852c91a2e8ccd938cbca742efe33a756b0d5448a085618cd2a79f2" Apr 20 17:58:53.642991 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:53.642938 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec42a7b7e2852c91a2e8ccd938cbca742efe33a756b0d5448a085618cd2a79f2"} err="failed to 
get container status \"ec42a7b7e2852c91a2e8ccd938cbca742efe33a756b0d5448a085618cd2a79f2\": rpc error: code = NotFound desc = could not find container \"ec42a7b7e2852c91a2e8ccd938cbca742efe33a756b0d5448a085618cd2a79f2\": container with ID starting with ec42a7b7e2852c91a2e8ccd938cbca742efe33a756b0d5448a085618cd2a79f2 not found: ID does not exist" Apr 20 17:58:53.656035 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:53.655998 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6n65m"] Apr 20 17:58:53.662109 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:53.662081 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6n65m"] Apr 20 17:58:53.766348 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:53.766312 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1c6d55e-86d3-4aa9-9623-28e5469ebe27" path="/var/lib/kubelet/pods/f1c6d55e-86d3-4aa9-9623-28e5469ebe27/volumes" Apr 20 17:58:56.834044 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:56.834002 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g"] Apr 20 17:58:56.834501 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:56.834318 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063" containerName="manager" Apr 20 17:58:56.834501 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:56.834349 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063" containerName="manager" Apr 20 17:58:56.834501 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:56.834386 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1c6d55e-86d3-4aa9-9623-28e5469ebe27" containerName="manager" Apr 20 17:58:56.834501 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:56.834393 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1c6d55e-86d3-4aa9-9623-28e5469ebe27" containerName="manager" Apr 20 17:58:56.834501 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:56.834478 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1c6d55e-86d3-4aa9-9623-28e5469ebe27" containerName="manager" Apr 20 17:58:56.834501 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:56.834492 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="1dd66ce1-a5bb-4b89-b6d3-c0af7e78c063" containerName="manager" Apr 20 17:58:56.839824 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:56.839796 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:56.843106 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:56.843079 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-4kq2k\"" Apr 20 17:58:56.851698 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:56.851668 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g"] Apr 20 17:58:56.941326 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:56.941268 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0a59f0c1-7444-4dd6-a891-f19c6ffef70c-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-mwx4g\" (UID: \"0a59f0c1-7444-4dd6-a891-f19c6ffef70c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:56.941531 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:56.941356 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0a59f0c1-7444-4dd6-a891-f19c6ffef70c-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-mwx4g\" (UID: \"0a59f0c1-7444-4dd6-a891-f19c6ffef70c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:56.941531 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:56.941400 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/0a59f0c1-7444-4dd6-a891-f19c6ffef70c-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-mwx4g\" (UID: \"0a59f0c1-7444-4dd6-a891-f19c6ffef70c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:56.941531 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:56.941427 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/0a59f0c1-7444-4dd6-a891-f19c6ffef70c-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-mwx4g\" (UID: \"0a59f0c1-7444-4dd6-a891-f19c6ffef70c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:56.941531 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:56.941473 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/0a59f0c1-7444-4dd6-a891-f19c6ffef70c-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-mwx4g\" (UID: \"0a59f0c1-7444-4dd6-a891-f19c6ffef70c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:56.941531 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:56.941502 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0a59f0c1-7444-4dd6-a891-f19c6ffef70c-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-mwx4g\" (UID: \"0a59f0c1-7444-4dd6-a891-f19c6ffef70c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:56.941531 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:56.941530 2581 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvdzc\" (UniqueName: \"kubernetes.io/projected/0a59f0c1-7444-4dd6-a891-f19c6ffef70c-kube-api-access-dvdzc\") pod \"maas-default-gateway-openshift-default-58b6f876-mwx4g\" (UID: \"0a59f0c1-7444-4dd6-a891-f19c6ffef70c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:56.941861 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:56.941567 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0a59f0c1-7444-4dd6-a891-f19c6ffef70c-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-mwx4g\" (UID: \"0a59f0c1-7444-4dd6-a891-f19c6ffef70c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:56.941861 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:56.941590 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0a59f0c1-7444-4dd6-a891-f19c6ffef70c-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-mwx4g\" (UID: \"0a59f0c1-7444-4dd6-a891-f19c6ffef70c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:57.042907 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:57.042858 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dvdzc\" (UniqueName: \"kubernetes.io/projected/0a59f0c1-7444-4dd6-a891-f19c6ffef70c-kube-api-access-dvdzc\") pod \"maas-default-gateway-openshift-default-58b6f876-mwx4g\" (UID: \"0a59f0c1-7444-4dd6-a891-f19c6ffef70c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:57.043174 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:57.042912 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0a59f0c1-7444-4dd6-a891-f19c6ffef70c-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-mwx4g\" (UID: \"0a59f0c1-7444-4dd6-a891-f19c6ffef70c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:57.043174 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:57.042941 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0a59f0c1-7444-4dd6-a891-f19c6ffef70c-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-mwx4g\" (UID: \"0a59f0c1-7444-4dd6-a891-f19c6ffef70c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:57.043174 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:57.043024 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0a59f0c1-7444-4dd6-a891-f19c6ffef70c-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-mwx4g\" (UID: \"0a59f0c1-7444-4dd6-a891-f19c6ffef70c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:57.043174 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:57.043068 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0a59f0c1-7444-4dd6-a891-f19c6ffef70c-workload-socket\") pod 
\"maas-default-gateway-openshift-default-58b6f876-mwx4g\" (UID: \"0a59f0c1-7444-4dd6-a891-f19c6ffef70c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:57.043174 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:57.043103 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/0a59f0c1-7444-4dd6-a891-f19c6ffef70c-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-mwx4g\" (UID: \"0a59f0c1-7444-4dd6-a891-f19c6ffef70c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:57.043174 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:57.043134 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/0a59f0c1-7444-4dd6-a891-f19c6ffef70c-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-mwx4g\" (UID: \"0a59f0c1-7444-4dd6-a891-f19c6ffef70c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:57.043174 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:57.043177 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/0a59f0c1-7444-4dd6-a891-f19c6ffef70c-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-mwx4g\" (UID: \"0a59f0c1-7444-4dd6-a891-f19c6ffef70c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:57.043537 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:57.043202 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0a59f0c1-7444-4dd6-a891-f19c6ffef70c-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-mwx4g\" (UID: \"0a59f0c1-7444-4dd6-a891-f19c6ffef70c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:57.043687 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:57.043658 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0a59f0c1-7444-4dd6-a891-f19c6ffef70c-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-mwx4g\" (UID: \"0a59f0c1-7444-4dd6-a891-f19c6ffef70c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:57.043766 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:57.043697 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/0a59f0c1-7444-4dd6-a891-f19c6ffef70c-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-mwx4g\" (UID: \"0a59f0c1-7444-4dd6-a891-f19c6ffef70c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:57.043766 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:57.043749 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0a59f0c1-7444-4dd6-a891-f19c6ffef70c-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-mwx4g\" (UID: \"0a59f0c1-7444-4dd6-a891-f19c6ffef70c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:57.043876 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:57.043852 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0a59f0c1-7444-4dd6-a891-f19c6ffef70c-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-mwx4g\" (UID: \"0a59f0c1-7444-4dd6-a891-f19c6ffef70c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:57.043927 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:57.043872 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/0a59f0c1-7444-4dd6-a891-f19c6ffef70c-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-mwx4g\" (UID: \"0a59f0c1-7444-4dd6-a891-f19c6ffef70c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:57.045492 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:57.045459 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/0a59f0c1-7444-4dd6-a891-f19c6ffef70c-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-mwx4g\" (UID: \"0a59f0c1-7444-4dd6-a891-f19c6ffef70c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:57.045586 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:57.045561 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0a59f0c1-7444-4dd6-a891-f19c6ffef70c-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-mwx4g\" (UID: \"0a59f0c1-7444-4dd6-a891-f19c6ffef70c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:57.051392 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:57.051369 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0a59f0c1-7444-4dd6-a891-f19c6ffef70c-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-mwx4g\" (UID: \"0a59f0c1-7444-4dd6-a891-f19c6ffef70c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:57.051619 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:57.051589 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvdzc\" (UniqueName: \"kubernetes.io/projected/0a59f0c1-7444-4dd6-a891-f19c6ffef70c-kube-api-access-dvdzc\") pod \"maas-default-gateway-openshift-default-58b6f876-mwx4g\" (UID: \"0a59f0c1-7444-4dd6-a891-f19c6ffef70c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:57.155569 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:57.155532 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:58:57.284395 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:57.284317 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g"] Apr 20 17:58:57.286714 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:58:57.286686 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a59f0c1_7444_4dd6_a891_f19c6ffef70c.slice/crio-b226d7efe00c7fa5972049dd1add66d64e6f10a6d8a3e0a253e4a3b34af0349f WatchSource:0}: Error finding container b226d7efe00c7fa5972049dd1add66d64e6f10a6d8a3e0a253e4a3b34af0349f: Status 404 returned error can't find the container with id b226d7efe00c7fa5972049dd1add66d64e6f10a6d8a3e0a253e4a3b34af0349f Apr 20 17:58:57.648462 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:57.648378 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" event={"ID":"0a59f0c1-7444-4dd6-a891-f19c6ffef70c","Type":"ContainerStarted","Data":"b226d7efe00c7fa5972049dd1add66d64e6f10a6d8a3e0a253e4a3b34af0349f"} Apr 20 17:58:59.767798 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:59.767761 2581 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 20 17:58:59.768077 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:59.767843 2581 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 20 17:58:59.768077 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:58:59.767873 2581 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 20 17:59:00.660210 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:00.660176 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" event={"ID":"0a59f0c1-7444-4dd6-a891-f19c6ffef70c","Type":"ContainerStarted","Data":"072858465d446cdb7437e1e32828e0c56770a05d3e9f555d103199345c906379"} Apr 20 17:59:00.679984 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:00.679905 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" podStartSLOduration=2.201296686 podStartE2EDuration="4.679889477s" podCreationTimestamp="2026-04-20 17:58:56 +0000 UTC" firstStartedPulling="2026-04-20 17:58:57.288919232 +0000 UTC m=+656.064450005" lastFinishedPulling="2026-04-20 17:58:59.767512022 +0000 UTC m=+658.543042796" observedRunningTime="2026-04-20 17:59:00.678664597 +0000 UTC m=+659.454195390" watchObservedRunningTime="2026-04-20 17:59:00.679889477 +0000 UTC m=+659.455420272" Apr 20 17:59:01.156712 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:01.156671 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:59:01.161282 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:01.161259 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:59:01.663881 
ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:01.663855 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:59:01.664992 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:01.664972 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-mwx4g" Apr 20 17:59:10.532986 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:10.532947 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-q7t4d"] Apr 20 17:59:10.537662 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:10.537614 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-q7t4d" Apr 20 17:59:10.540386 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:10.540364 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-tbdkf\"" Apr 20 17:59:10.540524 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:10.540408 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 20 17:59:10.543454 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:10.543430 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-q7t4d"] Apr 20 17:59:10.552434 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:10.552406 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5b1bd5e1-e767-480d-b3a6-218823065f08-config-file\") pod \"limitador-limitador-7d549b5b-q7t4d\" (UID: \"5b1bd5e1-e767-480d-b3a6-218823065f08\") " pod="kuadrant-system/limitador-limitador-7d549b5b-q7t4d" Apr 20 17:59:10.552586 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:10.552453 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zzwg\" (UniqueName: \"kubernetes.io/projected/5b1bd5e1-e767-480d-b3a6-218823065f08-kube-api-access-4zzwg\") pod \"limitador-limitador-7d549b5b-q7t4d\" (UID: \"5b1bd5e1-e767-480d-b3a6-218823065f08\") " pod="kuadrant-system/limitador-limitador-7d549b5b-q7t4d" Apr 20 17:59:10.628448 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:10.628413 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-q7t4d"] Apr 20 17:59:10.653645 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:10.653585 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5b1bd5e1-e767-480d-b3a6-218823065f08-config-file\") pod \"limitador-limitador-7d549b5b-q7t4d\" (UID: \"5b1bd5e1-e767-480d-b3a6-218823065f08\") " pod="kuadrant-system/limitador-limitador-7d549b5b-q7t4d" Apr 20 17:59:10.653825 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:10.653763 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zzwg\" (UniqueName: \"kubernetes.io/projected/5b1bd5e1-e767-480d-b3a6-218823065f08-kube-api-access-4zzwg\") pod \"limitador-limitador-7d549b5b-q7t4d\" (UID: \"5b1bd5e1-e767-480d-b3a6-218823065f08\") " pod="kuadrant-system/limitador-limitador-7d549b5b-q7t4d" Apr 20 17:59:10.654249 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:10.654230 2581 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5b1bd5e1-e767-480d-b3a6-218823065f08-config-file\") pod \"limitador-limitador-7d549b5b-q7t4d\" (UID: \"5b1bd5e1-e767-480d-b3a6-218823065f08\") " pod="kuadrant-system/limitador-limitador-7d549b5b-q7t4d" Apr 20 17:59:10.661791 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:10.661765 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zzwg\" (UniqueName: \"kubernetes.io/projected/5b1bd5e1-e767-480d-b3a6-218823065f08-kube-api-access-4zzwg\") pod \"limitador-limitador-7d549b5b-q7t4d\" (UID: \"5b1bd5e1-e767-480d-b3a6-218823065f08\") " pod="kuadrant-system/limitador-limitador-7d549b5b-q7t4d" Apr 20 17:59:10.849277 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:10.849184 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-q7t4d" Apr 20 17:59:10.976487 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:10.976453 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-q7t4d"] Apr 20 17:59:10.980557 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:59:10.980519 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b1bd5e1_e767_480d_b3a6_218823065f08.slice/crio-7919c7e805300ac6ed8d70ed2d930bf1730d9711e71742059a7599e3fda453bd WatchSource:0}: Error finding container 7919c7e805300ac6ed8d70ed2d930bf1730d9711e71742059a7599e3fda453bd: Status 404 returned error can't find the container with id 7919c7e805300ac6ed8d70ed2d930bf1730d9711e71742059a7599e3fda453bd Apr 20 17:59:11.698800 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:11.698757 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-q7t4d" event={"ID":"5b1bd5e1-e767-480d-b3a6-218823065f08","Type":"ContainerStarted","Data":"7919c7e805300ac6ed8d70ed2d930bf1730d9711e71742059a7599e3fda453bd"} Apr 20 17:59:13.706987 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:13.706942 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-q7t4d" event={"ID":"5b1bd5e1-e767-480d-b3a6-218823065f08","Type":"ContainerStarted","Data":"7668803b0264b78a0ebea2a7428d96531ca5a7ca7f9cc828d5c17c70efbc5a0c"} Apr 20 17:59:13.707373 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:13.707066 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-q7t4d" Apr 20 17:59:13.727718 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:13.727661 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-q7t4d" podStartSLOduration=1.096491393 podStartE2EDuration="3.727646508s" podCreationTimestamp="2026-04-20 17:59:10 +0000 UTC" firstStartedPulling="2026-04-20 17:59:10.982705843 +0000 UTC m=+669.758236615" lastFinishedPulling="2026-04-20 17:59:13.613860954 +0000 UTC m=+672.389391730" observedRunningTime="2026-04-20 17:59:13.725850229 +0000 UTC m=+672.501381021" watchObservedRunningTime="2026-04-20 17:59:13.727646508 +0000 UTC m=+672.503177341" Apr 20 17:59:24.711796 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:24.711760 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-q7t4d" Apr 20 17:59:25.672572 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:25.672534 2581 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-q7t4d"] Apr 20 17:59:25.672793 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:25.672770 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-q7t4d" podUID="5b1bd5e1-e767-480d-b3a6-218823065f08" containerName="limitador" containerID="cri-o://7668803b0264b78a0ebea2a7428d96531ca5a7ca7f9cc828d5c17c70efbc5a0c" gracePeriod=30 Apr 20 17:59:26.218828 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:26.218800 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-q7t4d" Apr 20 17:59:26.290969 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:26.290866 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zzwg\" (UniqueName: \"kubernetes.io/projected/5b1bd5e1-e767-480d-b3a6-218823065f08-kube-api-access-4zzwg\") pod \"5b1bd5e1-e767-480d-b3a6-218823065f08\" (UID: \"5b1bd5e1-e767-480d-b3a6-218823065f08\") " Apr 20 17:59:26.290969 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:26.290929 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5b1bd5e1-e767-480d-b3a6-218823065f08-config-file\") pod \"5b1bd5e1-e767-480d-b3a6-218823065f08\" (UID: \"5b1bd5e1-e767-480d-b3a6-218823065f08\") " Apr 20 17:59:26.291336 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:26.291308 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b1bd5e1-e767-480d-b3a6-218823065f08-config-file" (OuterVolumeSpecName: "config-file") pod "5b1bd5e1-e767-480d-b3a6-218823065f08" (UID: "5b1bd5e1-e767-480d-b3a6-218823065f08"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 17:59:26.293058 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:26.293029 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b1bd5e1-e767-480d-b3a6-218823065f08-kube-api-access-4zzwg" (OuterVolumeSpecName: "kube-api-access-4zzwg") pod "5b1bd5e1-e767-480d-b3a6-218823065f08" (UID: "5b1bd5e1-e767-480d-b3a6-218823065f08"). InnerVolumeSpecName "kube-api-access-4zzwg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 17:59:26.391864 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:26.391819 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4zzwg\" (UniqueName: \"kubernetes.io/projected/5b1bd5e1-e767-480d-b3a6-218823065f08-kube-api-access-4zzwg\") on node \"ip-10-0-138-9.ec2.internal\" DevicePath \"\"" Apr 20 17:59:26.391864 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:26.391853 2581 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5b1bd5e1-e767-480d-b3a6-218823065f08-config-file\") on node \"ip-10-0-138-9.ec2.internal\" DevicePath \"\"" Apr 20 17:59:26.750517 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:26.750478 2581 generic.go:358] "Generic (PLEG): container finished" podID="5b1bd5e1-e767-480d-b3a6-218823065f08" containerID="7668803b0264b78a0ebea2a7428d96531ca5a7ca7f9cc828d5c17c70efbc5a0c" exitCode=0 Apr 20 17:59:26.750717 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:26.750575 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-q7t4d" Apr 20 17:59:26.750717 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:26.750567 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-q7t4d" event={"ID":"5b1bd5e1-e767-480d-b3a6-218823065f08","Type":"ContainerDied","Data":"7668803b0264b78a0ebea2a7428d96531ca5a7ca7f9cc828d5c17c70efbc5a0c"} Apr 20 17:59:26.750717 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:26.750693 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-q7t4d" event={"ID":"5b1bd5e1-e767-480d-b3a6-218823065f08","Type":"ContainerDied","Data":"7919c7e805300ac6ed8d70ed2d930bf1730d9711e71742059a7599e3fda453bd"} Apr 20 17:59:26.750717 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:26.750713 2581 scope.go:117] "RemoveContainer" containerID="7668803b0264b78a0ebea2a7428d96531ca5a7ca7f9cc828d5c17c70efbc5a0c" Apr 20 17:59:26.758637 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:26.758604 2581 scope.go:117] "RemoveContainer" containerID="7668803b0264b78a0ebea2a7428d96531ca5a7ca7f9cc828d5c17c70efbc5a0c" Apr 20 17:59:26.758913 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:59:26.758887 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7668803b0264b78a0ebea2a7428d96531ca5a7ca7f9cc828d5c17c70efbc5a0c\": container with ID starting with 7668803b0264b78a0ebea2a7428d96531ca5a7ca7f9cc828d5c17c70efbc5a0c not found: ID does not exist" containerID="7668803b0264b78a0ebea2a7428d96531ca5a7ca7f9cc828d5c17c70efbc5a0c" Apr 20 17:59:26.759013 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:26.758926 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7668803b0264b78a0ebea2a7428d96531ca5a7ca7f9cc828d5c17c70efbc5a0c"} err="failed to get container status \"7668803b0264b78a0ebea2a7428d96531ca5a7ca7f9cc828d5c17c70efbc5a0c\": rpc error: code = NotFound desc = could not find container \"7668803b0264b78a0ebea2a7428d96531ca5a7ca7f9cc828d5c17c70efbc5a0c\": container with ID starting with 7668803b0264b78a0ebea2a7428d96531ca5a7ca7f9cc828d5c17c70efbc5a0c not found: ID does not exist" Apr 20 17:59:26.770914 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:26.770887 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-q7t4d"] Apr 20 17:59:26.774667 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:26.774619 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-q7t4d"] Apr 20 17:59:27.291593 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:27.291558 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-z626m"] Apr 20 17:59:27.292188 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:27.291894 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b1bd5e1-e767-480d-b3a6-218823065f08" containerName="limitador" Apr 20 17:59:27.292188 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:27.291905 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b1bd5e1-e767-480d-b3a6-218823065f08" containerName="limitador" Apr 20 17:59:27.292188 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:27.291965 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b1bd5e1-e767-480d-b3a6-218823065f08" containerName="limitador" Apr 20 17:59:27.296436 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:27.296409 2581 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-z626m" Apr 20 17:59:27.298919 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:27.298896 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 20 17:59:27.299062 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:27.298902 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-jmpwh\"" Apr 20 17:59:27.302397 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:27.302367 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-z626m"] Apr 20 17:59:27.400194 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:27.400160 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f82s\" (UniqueName: \"kubernetes.io/projected/51daf027-c1b9-465f-8f03-d53c5ba63343-kube-api-access-9f82s\") pod \"postgres-868db5846d-z626m\" (UID: \"51daf027-c1b9-465f-8f03-d53c5ba63343\") " pod="opendatahub/postgres-868db5846d-z626m" Apr 20 17:59:27.400394 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:27.400345 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/51daf027-c1b9-465f-8f03-d53c5ba63343-data\") pod \"postgres-868db5846d-z626m\" (UID: \"51daf027-c1b9-465f-8f03-d53c5ba63343\") " pod="opendatahub/postgres-868db5846d-z626m" Apr 20 17:59:27.501539 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:27.501495 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/51daf027-c1b9-465f-8f03-d53c5ba63343-data\") pod \"postgres-868db5846d-z626m\" (UID: \"51daf027-c1b9-465f-8f03-d53c5ba63343\") " pod="opendatahub/postgres-868db5846d-z626m" Apr 20 17:59:27.501720 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:27.501553 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9f82s\" (UniqueName: \"kubernetes.io/projected/51daf027-c1b9-465f-8f03-d53c5ba63343-kube-api-access-9f82s\") pod \"postgres-868db5846d-z626m\" (UID: \"51daf027-c1b9-465f-8f03-d53c5ba63343\") " pod="opendatahub/postgres-868db5846d-z626m" Apr 20 17:59:27.501950 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:27.501929 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/51daf027-c1b9-465f-8f03-d53c5ba63343-data\") pod \"postgres-868db5846d-z626m\" (UID: \"51daf027-c1b9-465f-8f03-d53c5ba63343\") " pod="opendatahub/postgres-868db5846d-z626m" Apr 20 17:59:27.509956 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:27.509926 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f82s\" (UniqueName: \"kubernetes.io/projected/51daf027-c1b9-465f-8f03-d53c5ba63343-kube-api-access-9f82s\") pod \"postgres-868db5846d-z626m\" (UID: \"51daf027-c1b9-465f-8f03-d53c5ba63343\") " pod="opendatahub/postgres-868db5846d-z626m" Apr 20 17:59:27.608273 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:27.608170 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-z626m" Apr 20 17:59:27.728504 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:27.728471 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-z626m"] Apr 20 17:59:27.731602 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:59:27.731575 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51daf027_c1b9_465f_8f03_d53c5ba63343.slice/crio-93cf7005c8a48e1a5b59dbeb7862928d414dfc2111b8c86955f7be92ba98eaba WatchSource:0}: Error finding container 93cf7005c8a48e1a5b59dbeb7862928d414dfc2111b8c86955f7be92ba98eaba: Status 404 returned error can't find the container with id 93cf7005c8a48e1a5b59dbeb7862928d414dfc2111b8c86955f7be92ba98eaba Apr 20 17:59:27.756076 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:27.756039 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-z626m" event={"ID":"51daf027-c1b9-465f-8f03-d53c5ba63343","Type":"ContainerStarted","Data":"93cf7005c8a48e1a5b59dbeb7862928d414dfc2111b8c86955f7be92ba98eaba"} Apr 20 17:59:27.765482 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:27.765455 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b1bd5e1-e767-480d-b3a6-218823065f08" path="/var/lib/kubelet/pods/5b1bd5e1-e767-480d-b3a6-218823065f08/volumes" Apr 20 17:59:32.775170 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:32.775129 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-z626m" event={"ID":"51daf027-c1b9-465f-8f03-d53c5ba63343","Type":"ContainerStarted","Data":"dd1f3c2c35c5622a3cf5348e0ae578a96332ef5eac23a8bcf4498d096983c228"} Apr 20 17:59:32.775545 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:32.775243 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-z626m" Apr 20 17:59:32.792758 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:32.792708 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-z626m" podStartSLOduration=0.957659657 podStartE2EDuration="5.79269094s" podCreationTimestamp="2026-04-20 17:59:27 +0000 UTC" firstStartedPulling="2026-04-20 17:59:27.733224105 +0000 UTC m=+686.508754878" lastFinishedPulling="2026-04-20 17:59:32.56825537 +0000 UTC m=+691.343786161" observedRunningTime="2026-04-20 17:59:32.791173812 +0000 UTC m=+691.566704609" watchObservedRunningTime="2026-04-20 17:59:32.79269094 +0000 UTC m=+691.568221736" Apr 20 17:59:38.808130 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:38.808056 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-z626m" Apr 20 17:59:42.172235 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:42.172198 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-668fb78c9c-drr6n"] Apr 20 17:59:42.175578 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:42.175557 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-668fb78c9c-drr6n" Apr 20 17:59:42.178350 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:42.178328 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-c858h\"" Apr 20 17:59:42.186834 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:42.186813 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-668fb78c9c-drr6n"] Apr 20 17:59:42.311428 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:42.311391 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6cc5b84798-wzjqh"] Apr 20 17:59:42.314681 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:42.314657 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6cc5b84798-wzjqh" Apr 20 17:59:42.322618 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:42.322592 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6cc5b84798-wzjqh"] Apr 20 17:59:42.338770 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:42.338741 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh588\" (UniqueName: \"kubernetes.io/projected/a98eedbd-4d54-4489-8fcd-5a85e57c70ce-kube-api-access-gh588\") pod \"maas-controller-668fb78c9c-drr6n\" (UID: \"a98eedbd-4d54-4489-8fcd-5a85e57c70ce\") " pod="opendatahub/maas-controller-668fb78c9c-drr6n" Apr 20 17:59:42.440153 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:42.440052 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gh588\" (UniqueName: \"kubernetes.io/projected/a98eedbd-4d54-4489-8fcd-5a85e57c70ce-kube-api-access-gh588\") pod \"maas-controller-668fb78c9c-drr6n\" (UID: \"a98eedbd-4d54-4489-8fcd-5a85e57c70ce\") " pod="opendatahub/maas-controller-668fb78c9c-drr6n" Apr 20 17:59:42.440153 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:42.440117 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnx2r\" (UniqueName: \"kubernetes.io/projected/6914e2d9-0b50-4ff7-97a5-b474b987ae0b-kube-api-access-xnx2r\") pod \"maas-controller-6cc5b84798-wzjqh\" (UID: \"6914e2d9-0b50-4ff7-97a5-b474b987ae0b\") " pod="opendatahub/maas-controller-6cc5b84798-wzjqh" Apr 20 17:59:42.448256 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:42.448225 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh588\" (UniqueName: \"kubernetes.io/projected/a98eedbd-4d54-4489-8fcd-5a85e57c70ce-kube-api-access-gh588\") pod \"maas-controller-668fb78c9c-drr6n\" (UID: \"a98eedbd-4d54-4489-8fcd-5a85e57c70ce\") " pod="opendatahub/maas-controller-668fb78c9c-drr6n" Apr 20 17:59:42.486140 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:42.486103 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-668fb78c9c-drr6n" Apr 20 17:59:42.541265 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:42.541215 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xnx2r\" (UniqueName: \"kubernetes.io/projected/6914e2d9-0b50-4ff7-97a5-b474b987ae0b-kube-api-access-xnx2r\") pod \"maas-controller-6cc5b84798-wzjqh\" (UID: \"6914e2d9-0b50-4ff7-97a5-b474b987ae0b\") " pod="opendatahub/maas-controller-6cc5b84798-wzjqh" Apr 20 17:59:42.550070 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:42.550017 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnx2r\" (UniqueName: \"kubernetes.io/projected/6914e2d9-0b50-4ff7-97a5-b474b987ae0b-kube-api-access-xnx2r\") pod \"maas-controller-6cc5b84798-wzjqh\" (UID: \"6914e2d9-0b50-4ff7-97a5-b474b987ae0b\") " pod="opendatahub/maas-controller-6cc5b84798-wzjqh" Apr 20 17:59:42.625983 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:42.625943 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6cc5b84798-wzjqh" Apr 20 17:59:42.629574 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:42.629434 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-668fb78c9c-drr6n"] Apr 20 17:59:42.635067 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:59:42.635027 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda98eedbd_4d54_4489_8fcd_5a85e57c70ce.slice/crio-e6aa9faf87acdaecf5b66fc46020ace7cb244681d53553679044fc982a483769 WatchSource:0}: Error finding container e6aa9faf87acdaecf5b66fc46020ace7cb244681d53553679044fc982a483769: Status 404 returned error can't find the container with id e6aa9faf87acdaecf5b66fc46020ace7cb244681d53553679044fc982a483769 Apr 20 17:59:42.753475 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:42.753411 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6cc5b84798-wzjqh"] Apr 20 17:59:42.755763 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:59:42.755737 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6914e2d9_0b50_4ff7_97a5_b474b987ae0b.slice/crio-852e5734b01e781bdf6b2333ae0c3edb55834e0b41391cd5ef207df0eca06e57 WatchSource:0}: Error finding container 852e5734b01e781bdf6b2333ae0c3edb55834e0b41391cd5ef207df0eca06e57: Status 404 returned error can't find the container with id 852e5734b01e781bdf6b2333ae0c3edb55834e0b41391cd5ef207df0eca06e57 Apr 20 17:59:42.808173 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:42.808137 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6cc5b84798-wzjqh" event={"ID":"6914e2d9-0b50-4ff7-97a5-b474b987ae0b","Type":"ContainerStarted","Data":"852e5734b01e781bdf6b2333ae0c3edb55834e0b41391cd5ef207df0eca06e57"} Apr 20 17:59:42.809110 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:42.809085 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-668fb78c9c-drr6n" event={"ID":"a98eedbd-4d54-4489-8fcd-5a85e57c70ce","Type":"ContainerStarted","Data":"e6aa9faf87acdaecf5b66fc46020ace7cb244681d53553679044fc982a483769"} Apr 20 17:59:45.823762 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:45.823725 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6cc5b84798-wzjqh" 
event={"ID":"6914e2d9-0b50-4ff7-97a5-b474b987ae0b","Type":"ContainerStarted","Data":"0fd6de917825705b04cd59a1a29e96ed3cf0262bb5422b8deda822a5bfd25ea8"} Apr 20 17:59:45.824226 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:45.823847 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6cc5b84798-wzjqh" Apr 20 17:59:45.824882 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:45.824861 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-668fb78c9c-drr6n" event={"ID":"a98eedbd-4d54-4489-8fcd-5a85e57c70ce","Type":"ContainerStarted","Data":"4c2ce2b85c16180d0c334f46e8a2b2f4286e2c348494a8c948f6805b0c83ac1c"} Apr 20 17:59:45.824989 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:45.824975 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-668fb78c9c-drr6n" Apr 20 17:59:45.840678 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:45.840608 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6cc5b84798-wzjqh" podStartSLOduration=1.4498631419999999 podStartE2EDuration="3.84059241s" podCreationTimestamp="2026-04-20 17:59:42 +0000 UTC" firstStartedPulling="2026-04-20 17:59:42.75712279 +0000 UTC m=+701.532653562" lastFinishedPulling="2026-04-20 17:59:45.147852054 +0000 UTC m=+703.923382830" observedRunningTime="2026-04-20 17:59:45.839729096 +0000 UTC m=+704.615259893" watchObservedRunningTime="2026-04-20 17:59:45.84059241 +0000 UTC m=+704.616123208" Apr 20 17:59:45.854613 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:45.854562 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-668fb78c9c-drr6n" podStartSLOduration=1.332767702 podStartE2EDuration="3.854547313s" podCreationTimestamp="2026-04-20 17:59:42 +0000 UTC" firstStartedPulling="2026-04-20 17:59:42.636750384 +0000 UTC m=+701.412281169" lastFinishedPulling="2026-04-20 17:59:45.158529995 +0000 UTC m=+703.934060780" observedRunningTime="2026-04-20 17:59:45.853700827 +0000 UTC m=+704.629231624" watchObservedRunningTime="2026-04-20 17:59:45.854547313 +0000 UTC m=+704.630078108" Apr 20 17:59:56.832511 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:56.832477 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6cc5b84798-wzjqh" Apr 20 17:59:56.833043 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:56.832875 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-668fb78c9c-drr6n" Apr 20 17:59:56.889198 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:56.889165 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-668fb78c9c-drr6n"] Apr 20 17:59:56.889415 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:56.889393 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-668fb78c9c-drr6n" podUID="a98eedbd-4d54-4489-8fcd-5a85e57c70ce" containerName="manager" containerID="cri-o://4c2ce2b85c16180d0c334f46e8a2b2f4286e2c348494a8c948f6805b0c83ac1c" gracePeriod=10 Apr 20 17:59:57.130697 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:57.130675 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-668fb78c9c-drr6n" Apr 20 17:59:57.159607 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:57.159582 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh588\" (UniqueName: \"kubernetes.io/projected/a98eedbd-4d54-4489-8fcd-5a85e57c70ce-kube-api-access-gh588\") pod \"a98eedbd-4d54-4489-8fcd-5a85e57c70ce\" (UID: \"a98eedbd-4d54-4489-8fcd-5a85e57c70ce\") " Apr 20 17:59:57.161746 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:57.161709 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a98eedbd-4d54-4489-8fcd-5a85e57c70ce-kube-api-access-gh588" (OuterVolumeSpecName: "kube-api-access-gh588") pod "a98eedbd-4d54-4489-8fcd-5a85e57c70ce" (UID: "a98eedbd-4d54-4489-8fcd-5a85e57c70ce"). InnerVolumeSpecName "kube-api-access-gh588". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 17:59:57.183903 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:57.183873 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-7f97df6549-xk7cs"] Apr 20 17:59:57.184175 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:57.184163 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a98eedbd-4d54-4489-8fcd-5a85e57c70ce" containerName="manager" Apr 20 17:59:57.184216 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:57.184177 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="a98eedbd-4d54-4489-8fcd-5a85e57c70ce" containerName="manager" Apr 20 17:59:57.184248 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:57.184243 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="a98eedbd-4d54-4489-8fcd-5a85e57c70ce" containerName="manager" Apr 20 17:59:57.187221 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:57.187204 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7f97df6549-xk7cs" Apr 20 17:59:57.195571 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:57.195540 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7f97df6549-xk7cs"] Apr 20 17:59:57.260646 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:57.260587 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw2qf\" (UniqueName: \"kubernetes.io/projected/6a347933-69be-4214-94a8-781636e2d17b-kube-api-access-kw2qf\") pod \"maas-controller-7f97df6549-xk7cs\" (UID: \"6a347933-69be-4214-94a8-781636e2d17b\") " pod="opendatahub/maas-controller-7f97df6549-xk7cs" Apr 20 17:59:57.260819 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:57.260727 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gh588\" (UniqueName: \"kubernetes.io/projected/a98eedbd-4d54-4489-8fcd-5a85e57c70ce-kube-api-access-gh588\") on node \"ip-10-0-138-9.ec2.internal\" DevicePath \"\"" Apr 20 17:59:57.362107 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:57.362049 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kw2qf\" (UniqueName: \"kubernetes.io/projected/6a347933-69be-4214-94a8-781636e2d17b-kube-api-access-kw2qf\") pod \"maas-controller-7f97df6549-xk7cs\" (UID: \"6a347933-69be-4214-94a8-781636e2d17b\") " pod="opendatahub/maas-controller-7f97df6549-xk7cs" Apr 20 17:59:57.370751 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:57.370675 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw2qf\" (UniqueName: \"kubernetes.io/projected/6a347933-69be-4214-94a8-781636e2d17b-kube-api-access-kw2qf\") pod \"maas-controller-7f97df6549-xk7cs\" (UID: \"6a347933-69be-4214-94a8-781636e2d17b\") " pod="opendatahub/maas-controller-7f97df6549-xk7cs" Apr 20 17:59:57.498535 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:57.498490 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7f97df6549-xk7cs" Apr 20 17:59:57.616333 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:57.616178 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7f97df6549-xk7cs"] Apr 20 17:59:57.618987 ip-10-0-138-9 kubenswrapper[2581]: W0420 17:59:57.618963 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a347933_69be_4214_94a8_781636e2d17b.slice/crio-cc66726256a35ef555e7993bdf038f3c2cb4f9c5f824a37a12b5cbc1d5c84c80 WatchSource:0}: Error finding container cc66726256a35ef555e7993bdf038f3c2cb4f9c5f824a37a12b5cbc1d5c84c80: Status 404 returned error can't find the container with id cc66726256a35ef555e7993bdf038f3c2cb4f9c5f824a37a12b5cbc1d5c84c80 Apr 20 17:59:57.867450 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:57.867417 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7f97df6549-xk7cs" event={"ID":"6a347933-69be-4214-94a8-781636e2d17b","Type":"ContainerStarted","Data":"cc66726256a35ef555e7993bdf038f3c2cb4f9c5f824a37a12b5cbc1d5c84c80"} Apr 20 17:59:57.868536 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:57.868512 2581 generic.go:358] "Generic (PLEG): container finished" podID="a98eedbd-4d54-4489-8fcd-5a85e57c70ce" containerID="4c2ce2b85c16180d0c334f46e8a2b2f4286e2c348494a8c948f6805b0c83ac1c" exitCode=0 Apr 20 17:59:57.868668 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:57.868576 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-668fb78c9c-drr6n" Apr 20 17:59:57.868668 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:57.868596 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-668fb78c9c-drr6n" event={"ID":"a98eedbd-4d54-4489-8fcd-5a85e57c70ce","Type":"ContainerDied","Data":"4c2ce2b85c16180d0c334f46e8a2b2f4286e2c348494a8c948f6805b0c83ac1c"} Apr 20 17:59:57.868668 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:57.868658 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-668fb78c9c-drr6n" event={"ID":"a98eedbd-4d54-4489-8fcd-5a85e57c70ce","Type":"ContainerDied","Data":"e6aa9faf87acdaecf5b66fc46020ace7cb244681d53553679044fc982a483769"} Apr 20 17:59:57.868778 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:57.868677 2581 scope.go:117] "RemoveContainer" containerID="4c2ce2b85c16180d0c334f46e8a2b2f4286e2c348494a8c948f6805b0c83ac1c" Apr 20 17:59:57.876180 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:57.876163 2581 scope.go:117] "RemoveContainer" containerID="4c2ce2b85c16180d0c334f46e8a2b2f4286e2c348494a8c948f6805b0c83ac1c" Apr 20 17:59:57.876474 ip-10-0-138-9 kubenswrapper[2581]: E0420 17:59:57.876452 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c2ce2b85c16180d0c334f46e8a2b2f4286e2c348494a8c948f6805b0c83ac1c\": container with ID starting with 4c2ce2b85c16180d0c334f46e8a2b2f4286e2c348494a8c948f6805b0c83ac1c not found: ID does not exist" containerID="4c2ce2b85c16180d0c334f46e8a2b2f4286e2c348494a8c948f6805b0c83ac1c" Apr 20 17:59:57.876564 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:57.876490 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c2ce2b85c16180d0c334f46e8a2b2f4286e2c348494a8c948f6805b0c83ac1c"} err="failed to get container status \"4c2ce2b85c16180d0c334f46e8a2b2f4286e2c348494a8c948f6805b0c83ac1c\": rpc error: 
code = NotFound desc = could not find container \"4c2ce2b85c16180d0c334f46e8a2b2f4286e2c348494a8c948f6805b0c83ac1c\": container with ID starting with 4c2ce2b85c16180d0c334f46e8a2b2f4286e2c348494a8c948f6805b0c83ac1c not found: ID does not exist" Apr 20 17:59:57.885927 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:57.885899 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-668fb78c9c-drr6n"] Apr 20 17:59:57.889101 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:57.889076 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-668fb78c9c-drr6n"] Apr 20 17:59:58.873500 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:58.873464 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7f97df6549-xk7cs" event={"ID":"6a347933-69be-4214-94a8-781636e2d17b","Type":"ContainerStarted","Data":"41851667e9fcd0b7ef8a94efbdcdb719e0e3185a8495e8f13281831ae939cfff"} Apr 20 17:59:58.873969 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:58.873562 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-7f97df6549-xk7cs" Apr 20 17:59:58.891636 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:58.891572 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-7f97df6549-xk7cs" podStartSLOduration=1.5562831240000001 podStartE2EDuration="1.891556392s" podCreationTimestamp="2026-04-20 17:59:57 +0000 UTC" firstStartedPulling="2026-04-20 17:59:57.620210706 +0000 UTC m=+716.395741479" lastFinishedPulling="2026-04-20 17:59:57.955483966 +0000 UTC m=+716.731014747" observedRunningTime="2026-04-20 17:59:58.89003262 +0000 UTC m=+717.665563414" watchObservedRunningTime="2026-04-20 17:59:58.891556392 +0000 UTC m=+717.667087186" Apr 20 17:59:59.766659 ip-10-0-138-9 kubenswrapper[2581]: I0420 17:59:59.766606 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a98eedbd-4d54-4489-8fcd-5a85e57c70ce" path="/var/lib/kubelet/pods/a98eedbd-4d54-4489-8fcd-5a85e57c70ce/volumes" Apr 20 18:00:05.195539 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:05.195501 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-78db588b5d-2q6lj"] Apr 20 18:00:05.210820 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:05.210787 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-78db588b5d-2q6lj"] Apr 20 18:00:05.211006 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:05.210901 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-78db588b5d-2q6lj" Apr 20 18:00:05.213979 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:05.213942 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 20 18:00:05.214120 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:05.213981 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-xj288\"" Apr 20 18:00:05.214120 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:05.214028 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 20 18:00:05.328213 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:05.328163 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpcvs\" (UniqueName: \"kubernetes.io/projected/dffcea60-e788-410c-9a83-0f4870b898cf-kube-api-access-xpcvs\") pod \"maas-api-78db588b5d-2q6lj\" (UID: \"dffcea60-e788-410c-9a83-0f4870b898cf\") " pod="opendatahub/maas-api-78db588b5d-2q6lj" Apr 20 18:00:05.328392 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:05.328217 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/dffcea60-e788-410c-9a83-0f4870b898cf-maas-api-tls\") pod \"maas-api-78db588b5d-2q6lj\" (UID: \"dffcea60-e788-410c-9a83-0f4870b898cf\") " pod="opendatahub/maas-api-78db588b5d-2q6lj" Apr 20 18:00:05.429495 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:05.429453 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/dffcea60-e788-410c-9a83-0f4870b898cf-maas-api-tls\") pod \"maas-api-78db588b5d-2q6lj\" (UID: \"dffcea60-e788-410c-9a83-0f4870b898cf\") " pod="opendatahub/maas-api-78db588b5d-2q6lj" Apr 20 18:00:05.429674 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:05.429549 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpcvs\" (UniqueName: \"kubernetes.io/projected/dffcea60-e788-410c-9a83-0f4870b898cf-kube-api-access-xpcvs\") pod \"maas-api-78db588b5d-2q6lj\" (UID: \"dffcea60-e788-410c-9a83-0f4870b898cf\") " pod="opendatahub/maas-api-78db588b5d-2q6lj" Apr 20 18:00:05.432335 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:05.432304 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/dffcea60-e788-410c-9a83-0f4870b898cf-maas-api-tls\") pod \"maas-api-78db588b5d-2q6lj\" (UID: \"dffcea60-e788-410c-9a83-0f4870b898cf\") " pod="opendatahub/maas-api-78db588b5d-2q6lj" Apr 20 18:00:05.438375 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:05.438346 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpcvs\" (UniqueName: \"kubernetes.io/projected/dffcea60-e788-410c-9a83-0f4870b898cf-kube-api-access-xpcvs\") pod \"maas-api-78db588b5d-2q6lj\" (UID: \"dffcea60-e788-410c-9a83-0f4870b898cf\") " pod="opendatahub/maas-api-78db588b5d-2q6lj" Apr 20 18:00:05.523051 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:05.522954 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-78db588b5d-2q6lj" Apr 20 18:00:05.650648 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:05.650589 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-78db588b5d-2q6lj"] Apr 20 18:00:05.653552 ip-10-0-138-9 kubenswrapper[2581]: W0420 18:00:05.653521 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddffcea60_e788_410c_9a83_0f4870b898cf.slice/crio-427a8f0d693ce7fcb48c603f77b03fbe75df708abc528e0366dd136855a17f86 WatchSource:0}: Error finding container 427a8f0d693ce7fcb48c603f77b03fbe75df708abc528e0366dd136855a17f86: Status 404 returned error can't find the container with id 427a8f0d693ce7fcb48c603f77b03fbe75df708abc528e0366dd136855a17f86 Apr 20 18:00:05.898860 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:05.898771 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-78db588b5d-2q6lj" event={"ID":"dffcea60-e788-410c-9a83-0f4870b898cf","Type":"ContainerStarted","Data":"427a8f0d693ce7fcb48c603f77b03fbe75df708abc528e0366dd136855a17f86"} Apr 20 18:00:07.910291 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:07.910250 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-78db588b5d-2q6lj" event={"ID":"dffcea60-e788-410c-9a83-0f4870b898cf","Type":"ContainerStarted","Data":"b59fdce95ae31df87d675954ca4e795694e9b7eabc0843130ab1e8dfdcebdb9d"} Apr 20 18:00:07.910798 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:07.910448 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-78db588b5d-2q6lj" Apr 20 18:00:07.929579 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:07.929529 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-78db588b5d-2q6lj" podStartSLOduration=1.134785159 podStartE2EDuration="2.929513716s" podCreationTimestamp="2026-04-20 18:00:05 +0000 UTC" firstStartedPulling="2026-04-20 18:00:05.655104491 +0000 UTC m=+724.430635268" lastFinishedPulling="2026-04-20 18:00:07.449833052 +0000 UTC m=+726.225363825" observedRunningTime="2026-04-20 18:00:07.92745136 +0000 UTC m=+726.702982153" watchObservedRunningTime="2026-04-20 18:00:07.929513716 +0000 UTC m=+726.705044551" Apr 20 18:00:09.884593 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:09.884567 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-7f97df6549-xk7cs" Apr 20 18:00:09.923468 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:09.923434 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6cc5b84798-wzjqh"] Apr 20 18:00:09.923687 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:09.923664 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6cc5b84798-wzjqh" podUID="6914e2d9-0b50-4ff7-97a5-b474b987ae0b" containerName="manager" containerID="cri-o://0fd6de917825705b04cd59a1a29e96ed3cf0262bb5422b8deda822a5bfd25ea8" gracePeriod=10 Apr 20 18:00:10.175430 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:10.175401 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6cc5b84798-wzjqh" Apr 20 18:00:10.271832 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:10.271795 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnx2r\" (UniqueName: \"kubernetes.io/projected/6914e2d9-0b50-4ff7-97a5-b474b987ae0b-kube-api-access-xnx2r\") pod \"6914e2d9-0b50-4ff7-97a5-b474b987ae0b\" (UID: \"6914e2d9-0b50-4ff7-97a5-b474b987ae0b\") " Apr 20 18:00:10.273879 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:10.273848 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6914e2d9-0b50-4ff7-97a5-b474b987ae0b-kube-api-access-xnx2r" (OuterVolumeSpecName: "kube-api-access-xnx2r") pod "6914e2d9-0b50-4ff7-97a5-b474b987ae0b" (UID: "6914e2d9-0b50-4ff7-97a5-b474b987ae0b"). InnerVolumeSpecName "kube-api-access-xnx2r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 18:00:10.372907 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:10.372870 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xnx2r\" (UniqueName: \"kubernetes.io/projected/6914e2d9-0b50-4ff7-97a5-b474b987ae0b-kube-api-access-xnx2r\") on node \"ip-10-0-138-9.ec2.internal\" DevicePath \"\"" Apr 20 18:00:10.921555 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:10.921510 2581 generic.go:358] "Generic (PLEG): container finished" podID="6914e2d9-0b50-4ff7-97a5-b474b987ae0b" containerID="0fd6de917825705b04cd59a1a29e96ed3cf0262bb5422b8deda822a5bfd25ea8" exitCode=0 Apr 20 18:00:10.922000 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:10.921588 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6cc5b84798-wzjqh" event={"ID":"6914e2d9-0b50-4ff7-97a5-b474b987ae0b","Type":"ContainerDied","Data":"0fd6de917825705b04cd59a1a29e96ed3cf0262bb5422b8deda822a5bfd25ea8"} Apr 20 18:00:10.922000 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:10.921651 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6cc5b84798-wzjqh" event={"ID":"6914e2d9-0b50-4ff7-97a5-b474b987ae0b","Type":"ContainerDied","Data":"852e5734b01e781bdf6b2333ae0c3edb55834e0b41391cd5ef207df0eca06e57"} Apr 20 18:00:10.922000 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:10.921668 2581 scope.go:117] "RemoveContainer" containerID="0fd6de917825705b04cd59a1a29e96ed3cf0262bb5422b8deda822a5bfd25ea8" Apr 20 18:00:10.922000 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:10.921605 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6cc5b84798-wzjqh" Apr 20 18:00:10.930478 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:10.930290 2581 scope.go:117] "RemoveContainer" containerID="0fd6de917825705b04cd59a1a29e96ed3cf0262bb5422b8deda822a5bfd25ea8" Apr 20 18:00:10.930597 ip-10-0-138-9 kubenswrapper[2581]: E0420 18:00:10.930577 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fd6de917825705b04cd59a1a29e96ed3cf0262bb5422b8deda822a5bfd25ea8\": container with ID starting with 0fd6de917825705b04cd59a1a29e96ed3cf0262bb5422b8deda822a5bfd25ea8 not found: ID does not exist" containerID="0fd6de917825705b04cd59a1a29e96ed3cf0262bb5422b8deda822a5bfd25ea8" Apr 20 18:00:10.930685 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:10.930612 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fd6de917825705b04cd59a1a29e96ed3cf0262bb5422b8deda822a5bfd25ea8"} err="failed to get container status \"0fd6de917825705b04cd59a1a29e96ed3cf0262bb5422b8deda822a5bfd25ea8\": rpc error: code = NotFound desc = could not find container \"0fd6de917825705b04cd59a1a29e96ed3cf0262bb5422b8deda822a5bfd25ea8\": container with ID starting with 0fd6de917825705b04cd59a1a29e96ed3cf0262bb5422b8deda822a5bfd25ea8 not found: ID does not exist" Apr 20 18:00:10.943102 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:10.943074 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6cc5b84798-wzjqh"] Apr 20 18:00:10.946808 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:10.946787 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6cc5b84798-wzjqh"] Apr 20 18:00:11.766976 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:11.766942 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6914e2d9-0b50-4ff7-97a5-b474b987ae0b" path="/var/lib/kubelet/pods/6914e2d9-0b50-4ff7-97a5-b474b987ae0b/volumes" Apr 20 18:00:13.918646 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:13.918599 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-78db588b5d-2q6lj" Apr 20 18:00:33.023989 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.023950 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk"] Apr 20 18:00:33.024502 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.024274 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6914e2d9-0b50-4ff7-97a5-b474b987ae0b" containerName="manager" Apr 20 18:00:33.024502 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.024286 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="6914e2d9-0b50-4ff7-97a5-b474b987ae0b" containerName="manager" Apr 20 18:00:33.024502 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.024334 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="6914e2d9-0b50-4ff7-97a5-b474b987ae0b" containerName="manager" Apr 20 18:00:33.030616 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.030594 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk" Apr 20 18:00:33.034785 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.034758 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-5v8cn\"" Apr 20 18:00:33.034785 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.034769 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 20 18:00:33.035003 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.034788 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 20 18:00:33.035003 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.034794 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 20 18:00:33.038457 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.038432 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk"] Apr 20 18:00:33.168867 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.168827 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fc11b94f-330e-417f-881b-5a6d2b4a883c-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk\" (UID: \"fc11b94f-330e-417f-881b-5a6d2b4a883c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk" Apr 20 18:00:33.169068 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.168868 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fc11b94f-330e-417f-881b-5a6d2b4a883c-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk\" (UID: \"fc11b94f-330e-417f-881b-5a6d2b4a883c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk" Apr 20 18:00:33.169068 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.168899 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc11b94f-330e-417f-881b-5a6d2b4a883c-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk\" (UID: \"fc11b94f-330e-417f-881b-5a6d2b4a883c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk" Apr 20 18:00:33.169068 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.168987 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2897\" (UniqueName: \"kubernetes.io/projected/fc11b94f-330e-417f-881b-5a6d2b4a883c-kube-api-access-c2897\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk\" (UID: \"fc11b94f-330e-417f-881b-5a6d2b4a883c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk" Apr 20 18:00:33.169068 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.169053 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fc11b94f-330e-417f-881b-5a6d2b4a883c-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk\" (UID: \"fc11b94f-330e-417f-881b-5a6d2b4a883c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk" Apr 20 18:00:33.169251 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.169089 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fc11b94f-330e-417f-881b-5a6d2b4a883c-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk\" (UID: \"fc11b94f-330e-417f-881b-5a6d2b4a883c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk" Apr 20 18:00:33.270321 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.270285 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fc11b94f-330e-417f-881b-5a6d2b4a883c-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk\" (UID: \"fc11b94f-330e-417f-881b-5a6d2b4a883c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk" Apr 20 18:00:33.270321 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.270326 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fc11b94f-330e-417f-881b-5a6d2b4a883c-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk\" (UID: \"fc11b94f-330e-417f-881b-5a6d2b4a883c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk" Apr 20 18:00:33.270565 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.270371 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fc11b94f-330e-417f-881b-5a6d2b4a883c-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk\" (UID: \"fc11b94f-330e-417f-881b-5a6d2b4a883c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk" Apr 20 18:00:33.270565 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.270388 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fc11b94f-330e-417f-881b-5a6d2b4a883c-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk\" (UID: \"fc11b94f-330e-417f-881b-5a6d2b4a883c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk" Apr 20 18:00:33.270565 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.270407 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc11b94f-330e-417f-881b-5a6d2b4a883c-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk\" (UID: \"fc11b94f-330e-417f-881b-5a6d2b4a883c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk" Apr 20 18:00:33.270565 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.270435 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2897\" (UniqueName: \"kubernetes.io/projected/fc11b94f-330e-417f-881b-5a6d2b4a883c-kube-api-access-c2897\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk\" (UID: \"fc11b94f-330e-417f-881b-5a6d2b4a883c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk" Apr 20 18:00:33.270806 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.270770 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fc11b94f-330e-417f-881b-5a6d2b4a883c-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk\" (UID: \"fc11b94f-330e-417f-881b-5a6d2b4a883c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk" Apr 20 18:00:33.270874 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.270853 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fc11b94f-330e-417f-881b-5a6d2b4a883c-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk\" (UID: \"fc11b94f-330e-417f-881b-5a6d2b4a883c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk" Apr 20 18:00:33.271051 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.271037 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc11b94f-330e-417f-881b-5a6d2b4a883c-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk\" (UID: \"fc11b94f-330e-417f-881b-5a6d2b4a883c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk" Apr 20 18:00:33.272602 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.272584 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fc11b94f-330e-417f-881b-5a6d2b4a883c-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk\" (UID: \"fc11b94f-330e-417f-881b-5a6d2b4a883c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk" Apr 20 18:00:33.272878 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.272863 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fc11b94f-330e-417f-881b-5a6d2b4a883c-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk\" (UID: \"fc11b94f-330e-417f-881b-5a6d2b4a883c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk" Apr 20 18:00:33.279493 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.279426 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2897\" (UniqueName: \"kubernetes.io/projected/fc11b94f-330e-417f-881b-5a6d2b4a883c-kube-api-access-c2897\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk\" (UID: \"fc11b94f-330e-417f-881b-5a6d2b4a883c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk" Apr 20 18:00:33.342263 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.342223 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk" Apr 20 18:00:33.472222 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:33.472197 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk"] Apr 20 18:00:33.474611 ip-10-0-138-9 kubenswrapper[2581]: W0420 18:00:33.474582 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc11b94f_330e_417f_881b_5a6d2b4a883c.slice/crio-232003365ce3a17642ed7f8985413b0c7a5a9b29c2b082b3c7809855927e1ecc WatchSource:0}: Error finding container 232003365ce3a17642ed7f8985413b0c7a5a9b29c2b082b3c7809855927e1ecc: Status 404 returned error can't find the container with id 232003365ce3a17642ed7f8985413b0c7a5a9b29c2b082b3c7809855927e1ecc Apr 20 18:00:34.001433 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:34.001397 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk" event={"ID":"fc11b94f-330e-417f-881b-5a6d2b4a883c","Type":"ContainerStarted","Data":"232003365ce3a17642ed7f8985413b0c7a5a9b29c2b082b3c7809855927e1ecc"} Apr 20 18:00:36.411212 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:36.411180 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-78db588b5d-2q6lj"] Apr 20 18:00:36.411672 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:36.411486 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-78db588b5d-2q6lj" podUID="dffcea60-e788-410c-9a83-0f4870b898cf" containerName="maas-api" containerID="cri-o://b59fdce95ae31df87d675954ca4e795694e9b7eabc0843130ab1e8dfdcebdb9d" gracePeriod=30 Apr 20 18:00:36.687440 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:36.687315 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-78db588b5d-2q6lj" Apr 20 18:00:36.808402 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:36.808351 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpcvs\" (UniqueName: \"kubernetes.io/projected/dffcea60-e788-410c-9a83-0f4870b898cf-kube-api-access-xpcvs\") pod \"dffcea60-e788-410c-9a83-0f4870b898cf\" (UID: \"dffcea60-e788-410c-9a83-0f4870b898cf\") " Apr 20 18:00:36.808561 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:36.808469 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/dffcea60-e788-410c-9a83-0f4870b898cf-maas-api-tls\") pod \"dffcea60-e788-410c-9a83-0f4870b898cf\" (UID: \"dffcea60-e788-410c-9a83-0f4870b898cf\") " Apr 20 18:00:36.810447 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:36.810416 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dffcea60-e788-410c-9a83-0f4870b898cf-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "dffcea60-e788-410c-9a83-0f4870b898cf" (UID: "dffcea60-e788-410c-9a83-0f4870b898cf"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 18:00:36.810556 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:36.810472 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dffcea60-e788-410c-9a83-0f4870b898cf-kube-api-access-xpcvs" (OuterVolumeSpecName: "kube-api-access-xpcvs") pod "dffcea60-e788-410c-9a83-0f4870b898cf" (UID: "dffcea60-e788-410c-9a83-0f4870b898cf"). 
InnerVolumeSpecName "kube-api-access-xpcvs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 18:00:36.909564 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:36.909519 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xpcvs\" (UniqueName: \"kubernetes.io/projected/dffcea60-e788-410c-9a83-0f4870b898cf-kube-api-access-xpcvs\") on node \"ip-10-0-138-9.ec2.internal\" DevicePath \"\"" Apr 20 18:00:36.909564 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:36.909562 2581 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/dffcea60-e788-410c-9a83-0f4870b898cf-maas-api-tls\") on node \"ip-10-0-138-9.ec2.internal\" DevicePath \"\"" Apr 20 18:00:37.012815 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:37.012727 2581 generic.go:358] "Generic (PLEG): container finished" podID="dffcea60-e788-410c-9a83-0f4870b898cf" containerID="b59fdce95ae31df87d675954ca4e795694e9b7eabc0843130ab1e8dfdcebdb9d" exitCode=0 Apr 20 18:00:37.012815 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:37.012771 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-78db588b5d-2q6lj" event={"ID":"dffcea60-e788-410c-9a83-0f4870b898cf","Type":"ContainerDied","Data":"b59fdce95ae31df87d675954ca4e795694e9b7eabc0843130ab1e8dfdcebdb9d"} Apr 20 18:00:37.012815 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:37.012790 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-78db588b5d-2q6lj" Apr 20 18:00:37.012815 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:37.012802 2581 scope.go:117] "RemoveContainer" containerID="b59fdce95ae31df87d675954ca4e795694e9b7eabc0843130ab1e8dfdcebdb9d" Apr 20 18:00:37.013175 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:37.012792 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-78db588b5d-2q6lj" event={"ID":"dffcea60-e788-410c-9a83-0f4870b898cf","Type":"ContainerDied","Data":"427a8f0d693ce7fcb48c603f77b03fbe75df708abc528e0366dd136855a17f86"} Apr 20 18:00:37.021826 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:37.021805 2581 scope.go:117] "RemoveContainer" containerID="b59fdce95ae31df87d675954ca4e795694e9b7eabc0843130ab1e8dfdcebdb9d" Apr 20 18:00:37.022111 ip-10-0-138-9 kubenswrapper[2581]: E0420 18:00:37.022092 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b59fdce95ae31df87d675954ca4e795694e9b7eabc0843130ab1e8dfdcebdb9d\": container with ID starting with b59fdce95ae31df87d675954ca4e795694e9b7eabc0843130ab1e8dfdcebdb9d not found: ID does not exist" containerID="b59fdce95ae31df87d675954ca4e795694e9b7eabc0843130ab1e8dfdcebdb9d" Apr 20 18:00:37.022160 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:37.022121 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b59fdce95ae31df87d675954ca4e795694e9b7eabc0843130ab1e8dfdcebdb9d"} err="failed to get container status \"b59fdce95ae31df87d675954ca4e795694e9b7eabc0843130ab1e8dfdcebdb9d\": rpc error: code = NotFound desc = could not find container \"b59fdce95ae31df87d675954ca4e795694e9b7eabc0843130ab1e8dfdcebdb9d\": container with ID starting with b59fdce95ae31df87d675954ca4e795694e9b7eabc0843130ab1e8dfdcebdb9d not found: ID does not exist" Apr 20 18:00:37.034452 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:37.034421 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-78db588b5d-2q6lj"] Apr 20 18:00:37.039063 
ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:37.039039 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-78db588b5d-2q6lj"] Apr 20 18:00:37.767008 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:37.766968 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dffcea60-e788-410c-9a83-0f4870b898cf" path="/var/lib/kubelet/pods/dffcea60-e788-410c-9a83-0f4870b898cf/volumes" Apr 20 18:00:41.033164 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:41.033081 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk" event={"ID":"fc11b94f-330e-417f-881b-5a6d2b4a883c","Type":"ContainerStarted","Data":"ed99ff12f022ee9159e17c10eb58ff80677410434483b1d083baa1ec9a31c91a"} Apr 20 18:00:47.052803 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:47.052763 2581 generic.go:358] "Generic (PLEG): container finished" podID="fc11b94f-330e-417f-881b-5a6d2b4a883c" containerID="ed99ff12f022ee9159e17c10eb58ff80677410434483b1d083baa1ec9a31c91a" exitCode=0 Apr 20 18:00:47.053295 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:47.052809 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk" event={"ID":"fc11b94f-330e-417f-881b-5a6d2b4a883c","Type":"ContainerDied","Data":"ed99ff12f022ee9159e17c10eb58ff80677410434483b1d083baa1ec9a31c91a"} Apr 20 18:00:47.053548 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:47.053527 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 18:00:54.078447 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:54.078410 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk" event={"ID":"fc11b94f-330e-417f-881b-5a6d2b4a883c","Type":"ContainerStarted","Data":"4d60fc64adc681fe3475868947968ea8a3560b2354df689d94fdcfe9cfccd1e2"} Apr 20 18:00:54.078907 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:54.078658 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk" Apr 20 18:00:54.099890 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:00:54.099830 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk" podStartSLOduration=1.234211084 podStartE2EDuration="21.099813622s" podCreationTimestamp="2026-04-20 18:00:33 +0000 UTC" firstStartedPulling="2026-04-20 18:00:33.476372977 +0000 UTC m=+752.251903749" lastFinishedPulling="2026-04-20 18:00:53.341975514 +0000 UTC m=+772.117506287" observedRunningTime="2026-04-20 18:00:54.097409797 +0000 UTC m=+772.872940605" watchObservedRunningTime="2026-04-20 18:00:54.099813622 +0000 UTC m=+772.875344416" Apr 20 18:01:05.096612 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:05.096581 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk" Apr 20 18:01:12.426183 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:12.426139 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw"] Apr 20 18:01:12.426575 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:12.426520 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dffcea60-e788-410c-9a83-0f4870b898cf" containerName="maas-api" Apr 20 18:01:12.426575 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:12.426534 2581 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="dffcea60-e788-410c-9a83-0f4870b898cf" containerName="maas-api" Apr 20 18:01:12.426681 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:12.426602 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="dffcea60-e788-410c-9a83-0f4870b898cf" containerName="maas-api" Apr 20 18:01:12.461573 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:12.461530 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw"] Apr 20 18:01:12.461751 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:12.461696 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw" Apr 20 18:01:12.464447 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:12.464425 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 20 18:01:12.615284 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:12.615245 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw\" (UID: \"0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw" Apr 20 18:01:12.615284 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:12.615298 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw\" (UID: \"0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw" Apr 20 18:01:12.615563 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:12.615361 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw\" (UID: \"0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw" Apr 20 18:01:12.615563 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:12.615383 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw\" (UID: \"0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw" Apr 20 18:01:12.615563 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:12.615460 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hfjf\" (UniqueName: \"kubernetes.io/projected/0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c-kube-api-access-9hfjf\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw\" (UID: \"0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw" Apr 20 18:01:12.615563 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:12.615497 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tls-certs\" (UniqueName: \"kubernetes.io/secret/0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw\" (UID: \"0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw" Apr 20 18:01:12.715966 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:12.715870 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9hfjf\" (UniqueName: \"kubernetes.io/projected/0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c-kube-api-access-9hfjf\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw\" (UID: \"0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw" Apr 20 18:01:12.715966 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:12.715922 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw\" (UID: \"0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw" Apr 20 18:01:12.715966 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:12.715951 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw\" (UID: \"0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw" Apr 20 18:01:12.716269 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:12.715979 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw\" (UID: \"0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw" Apr 20 18:01:12.716269 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:12.716016 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw\" (UID: \"0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw" Apr 20 18:01:12.716269 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:12.716055 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw\" (UID: \"0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw" Apr 20 18:01:12.716561 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:12.716528 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw\" (UID: \"0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw" Apr 20 18:01:12.716714 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:12.716557 
2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw\" (UID: \"0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw" Apr 20 18:01:12.716714 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:12.716577 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw\" (UID: \"0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw" Apr 20 18:01:12.718437 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:12.718407 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw\" (UID: \"0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw" Apr 20 18:01:12.718521 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:12.718507 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw\" (UID: \"0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw" Apr 20 18:01:12.723599 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:12.723575 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hfjf\" (UniqueName: \"kubernetes.io/projected/0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c-kube-api-access-9hfjf\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw\" (UID: \"0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw" Apr 20 18:01:12.770819 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:12.770777 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw" Apr 20 18:01:12.900844 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:12.900741 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw"] Apr 20 18:01:12.903492 ip-10-0-138-9 kubenswrapper[2581]: W0420 18:01:12.903464 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0af7d2d1_50d4_4fcc_8187_86ffdcb0c58c.slice/crio-a2f7e068571966cc48f5de6cde56fb58f1347cd5cc8078594245968cbcb94f4f WatchSource:0}: Error finding container a2f7e068571966cc48f5de6cde56fb58f1347cd5cc8078594245968cbcb94f4f: Status 404 returned error can't find the container with id a2f7e068571966cc48f5de6cde56fb58f1347cd5cc8078594245968cbcb94f4f Apr 20 18:01:13.142615 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:13.142486 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw" event={"ID":"0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c","Type":"ContainerStarted","Data":"2db56bf8238d6f8e5aae5f5ba9ce9bc01e0f204fbeb81720e0c74102fd50d40d"} Apr 20 18:01:13.142615 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:13.142523 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw" event={"ID":"0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c","Type":"ContainerStarted","Data":"a2f7e068571966cc48f5de6cde56fb58f1347cd5cc8078594245968cbcb94f4f"} Apr 20 18:01:22.172757 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:22.172715 2581 generic.go:358] "Generic (PLEG): container finished" podID="0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c" containerID="2db56bf8238d6f8e5aae5f5ba9ce9bc01e0f204fbeb81720e0c74102fd50d40d" exitCode=0 Apr 20 18:01:22.172757 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:22.172749 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw" event={"ID":"0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c","Type":"ContainerDied","Data":"2db56bf8238d6f8e5aae5f5ba9ce9bc01e0f204fbeb81720e0c74102fd50d40d"} Apr 20 18:01:23.177396 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:23.177355 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw" event={"ID":"0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c","Type":"ContainerStarted","Data":"bb0b5dd5ffc589796a76ded6071b0048b357608f908109f4544936b24cfb102c"} Apr 20 18:01:23.177853 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:23.177575 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw" Apr 20 18:01:23.195990 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:23.195928 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw" podStartSLOduration=10.941717605000001 podStartE2EDuration="11.195896735s" podCreationTimestamp="2026-04-20 18:01:12 +0000 UTC" firstStartedPulling="2026-04-20 18:01:22.17336609 +0000 UTC m=+800.948896863" lastFinishedPulling="2026-04-20 18:01:22.427545217 +0000 UTC m=+801.203075993" observedRunningTime="2026-04-20 18:01:23.195868056 +0000 UTC m=+801.971398852" watchObservedRunningTime="2026-04-20 18:01:23.195896735 +0000 UTC m=+801.971427533" Apr 20 18:01:32.641350 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:32.641315 2581 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv"] Apr 20 18:01:32.667697 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:32.667662 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv"] Apr 20 18:01:32.667862 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:32.667790 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv" Apr 20 18:01:32.670532 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:32.670506 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 20 18:01:32.794933 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:32.794894 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/67b409c2-3e24-41f5-bcef-2f1f0f43bc64-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv\" (UID: \"67b409c2-3e24-41f5-bcef-2f1f0f43bc64\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv" Apr 20 18:01:32.794933 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:32.794933 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/67b409c2-3e24-41f5-bcef-2f1f0f43bc64-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv\" (UID: \"67b409c2-3e24-41f5-bcef-2f1f0f43bc64\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv" Apr 20 18:01:32.795145 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:32.794977 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55k2c\" (UniqueName: \"kubernetes.io/projected/67b409c2-3e24-41f5-bcef-2f1f0f43bc64-kube-api-access-55k2c\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv\" (UID: \"67b409c2-3e24-41f5-bcef-2f1f0f43bc64\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv" Apr 20 18:01:32.795145 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:32.795001 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67b409c2-3e24-41f5-bcef-2f1f0f43bc64-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv\" (UID: \"67b409c2-3e24-41f5-bcef-2f1f0f43bc64\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv" Apr 20 18:01:32.795145 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:32.795044 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/67b409c2-3e24-41f5-bcef-2f1f0f43bc64-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv\" (UID: \"67b409c2-3e24-41f5-bcef-2f1f0f43bc64\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv" Apr 20 18:01:32.795145 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:32.795069 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/67b409c2-3e24-41f5-bcef-2f1f0f43bc64-model-cache\") pod 
\"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv\" (UID: \"67b409c2-3e24-41f5-bcef-2f1f0f43bc64\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv" Apr 20 18:01:32.895509 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:32.895421 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/67b409c2-3e24-41f5-bcef-2f1f0f43bc64-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv\" (UID: \"67b409c2-3e24-41f5-bcef-2f1f0f43bc64\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv" Apr 20 18:01:32.895509 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:32.895464 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/67b409c2-3e24-41f5-bcef-2f1f0f43bc64-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv\" (UID: \"67b409c2-3e24-41f5-bcef-2f1f0f43bc64\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv" Apr 20 18:01:32.895763 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:32.895521 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/67b409c2-3e24-41f5-bcef-2f1f0f43bc64-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv\" (UID: \"67b409c2-3e24-41f5-bcef-2f1f0f43bc64\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv" Apr 20 18:01:32.895763 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:32.895540 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/67b409c2-3e24-41f5-bcef-2f1f0f43bc64-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv\" (UID: \"67b409c2-3e24-41f5-bcef-2f1f0f43bc64\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv" Apr 20 18:01:32.895763 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:32.895567 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55k2c\" (UniqueName: \"kubernetes.io/projected/67b409c2-3e24-41f5-bcef-2f1f0f43bc64-kube-api-access-55k2c\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv\" (UID: \"67b409c2-3e24-41f5-bcef-2f1f0f43bc64\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv" Apr 20 18:01:32.895763 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:32.895694 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67b409c2-3e24-41f5-bcef-2f1f0f43bc64-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv\" (UID: \"67b409c2-3e24-41f5-bcef-2f1f0f43bc64\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv" Apr 20 18:01:32.896007 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:32.895983 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/67b409c2-3e24-41f5-bcef-2f1f0f43bc64-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv\" (UID: \"67b409c2-3e24-41f5-bcef-2f1f0f43bc64\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv" Apr 20 18:01:32.896100 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:32.896013 2581 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/67b409c2-3e24-41f5-bcef-2f1f0f43bc64-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv\" (UID: \"67b409c2-3e24-41f5-bcef-2f1f0f43bc64\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv" Apr 20 18:01:32.896100 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:32.896079 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67b409c2-3e24-41f5-bcef-2f1f0f43bc64-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv\" (UID: \"67b409c2-3e24-41f5-bcef-2f1f0f43bc64\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv" Apr 20 18:01:32.897837 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:32.897813 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/67b409c2-3e24-41f5-bcef-2f1f0f43bc64-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv\" (UID: \"67b409c2-3e24-41f5-bcef-2f1f0f43bc64\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv" Apr 20 18:01:32.898241 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:32.898219 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/67b409c2-3e24-41f5-bcef-2f1f0f43bc64-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv\" (UID: \"67b409c2-3e24-41f5-bcef-2f1f0f43bc64\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv" Apr 20 18:01:32.904754 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:32.904723 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-55k2c\" (UniqueName: \"kubernetes.io/projected/67b409c2-3e24-41f5-bcef-2f1f0f43bc64-kube-api-access-55k2c\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv\" (UID: \"67b409c2-3e24-41f5-bcef-2f1f0f43bc64\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv" Apr 20 18:01:32.978197 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:32.978147 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv" Apr 20 18:01:33.138332 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:33.138304 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv"] Apr 20 18:01:33.140027 ip-10-0-138-9 kubenswrapper[2581]: W0420 18:01:33.139992 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67b409c2_3e24_41f5_bcef_2f1f0f43bc64.slice/crio-944c16478397b43aaa7448c32b2a11466b4d1610fefb0edd457fc096359d2865 WatchSource:0}: Error finding container 944c16478397b43aaa7448c32b2a11466b4d1610fefb0edd457fc096359d2865: Status 404 returned error can't find the container with id 944c16478397b43aaa7448c32b2a11466b4d1610fefb0edd457fc096359d2865 Apr 20 18:01:33.214576 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:33.214543 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv" event={"ID":"67b409c2-3e24-41f5-bcef-2f1f0f43bc64","Type":"ContainerStarted","Data":"944c16478397b43aaa7448c32b2a11466b4d1610fefb0edd457fc096359d2865"} Apr 20 18:01:34.195013 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:34.194974 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw" Apr 20 18:01:34.218575 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:34.218536 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv" event={"ID":"67b409c2-3e24-41f5-bcef-2f1f0f43bc64","Type":"ContainerStarted","Data":"812c4d7dd842a78d81e0ab317041f5a03fae661829e506eae8d1992a92bad90a"} Apr 20 18:01:39.236215 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:39.236182 2581 generic.go:358] "Generic (PLEG): container finished" podID="67b409c2-3e24-41f5-bcef-2f1f0f43bc64" containerID="812c4d7dd842a78d81e0ab317041f5a03fae661829e506eae8d1992a92bad90a" exitCode=0 Apr 20 18:01:39.236650 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:39.236264 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv" event={"ID":"67b409c2-3e24-41f5-bcef-2f1f0f43bc64","Type":"ContainerDied","Data":"812c4d7dd842a78d81e0ab317041f5a03fae661829e506eae8d1992a92bad90a"} Apr 20 18:01:40.240976 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:40.240942 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv" event={"ID":"67b409c2-3e24-41f5-bcef-2f1f0f43bc64","Type":"ContainerStarted","Data":"8a689baae2eb7c90d4046a2a3c3ef2dab48d0c69ed6c3377a685e6e787b03fe4"} Apr 20 18:01:40.241373 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:40.241154 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv" Apr 20 18:01:40.286860 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:40.286807 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv" podStartSLOduration=7.93932729 podStartE2EDuration="8.286791407s" podCreationTimestamp="2026-04-20 18:01:32 +0000 UTC" firstStartedPulling="2026-04-20 18:01:39.23689197 +0000 UTC m=+818.012422742" lastFinishedPulling="2026-04-20 18:01:39.584356086 +0000 UTC m=+818.359886859" 
observedRunningTime="2026-04-20 18:01:40.283243941 +0000 UTC m=+819.058774743" watchObservedRunningTime="2026-04-20 18:01:40.286791407 +0000 UTC m=+819.062322260" Apr 20 18:01:51.257333 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:01:51.257301 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv" Apr 20 18:03:39.207314 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:39.207273 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-vjnzh_3f6e3122-c4bd-400d-9578-aaf92e911ccf/manager/0.log" Apr 20 18:03:39.436789 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:39.436746 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-7f97df6549-xk7cs_6a347933-69be-4214-94a8-781636e2d17b/manager/0.log" Apr 20 18:03:39.585078 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:39.584989 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-lnffx_5c45cee7-7c81-4c6f-8550-4943f0ec1b33/manager/2.log" Apr 20 18:03:39.705383 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:39.705351 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-b8c4c7886-4f5ss_883ebf5e-88bc-4638-8ef1-8d3bf51e49ad/manager/0.log" Apr 20 18:03:40.078111 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:40.078081 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-z626m_51daf027-c1b9-465f-8f03-d53c5ba63343/postgres/0.log" Apr 20 18:03:41.404638 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:41.404592 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-wcpf7_181ed309-e548-4308-9955-e76168b875c6/manager/0.log" Apr 20 18:03:42.453476 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:42.453442 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-76zmb_56f9588d-d0d6-4172-9969-5f3f8ebaba4b/discovery/0.log" Apr 20 18:03:42.657583 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:42.657551 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-554dd5dd7d-xg5d4_d4cf93b3-09a3-4d10-997c-5b3c897b6be6/kube-auth-proxy/0.log" Apr 20 18:03:42.765374 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:42.765290 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-mwx4g_0a59f0c1-7444-4dd6-a891-f19c6ffef70c/istio-proxy/0.log" Apr 20 18:03:43.514566 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:43.514523 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv_67b409c2-3e24-41f5-bcef-2f1f0f43bc64/storage-initializer/0.log" Apr 20 18:03:43.521282 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:43.521257 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccqgskv_67b409c2-3e24-41f5-bcef-2f1f0f43bc64/main/0.log" Apr 20 18:03:43.623265 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:43.623236 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk_fc11b94f-330e-417f-881b-5a6d2b4a883c/storage-initializer/0.log" Apr 20 18:03:43.629475 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:43.629452 2581 
log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-xhxlk_fc11b94f-330e-417f-881b-5a6d2b4a883c/main/0.log" Apr 20 18:03:43.732001 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:43.731976 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw_0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c/storage-initializer/0.log" Apr 20 18:03:43.738740 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:43.738719 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-d6lpw_0af7d2d1-50d4-4fcc-8187-86ffdcb0c58c/main/0.log" Apr 20 18:03:50.490463 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:50.490435 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-xrxtp_9041de32-fafa-4935-a258-8c6ecce98d75/global-pull-secret-syncer/0.log" Apr 20 18:03:50.591138 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:50.591108 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-r8qd7_d36b98c1-fc6f-4438-84d5-25382aad1dc6/konnectivity-agent/0.log" Apr 20 18:03:50.656465 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:50.656426 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-9.ec2.internal_67dd5e3ce0ce81437a1036d614e3ee5e/haproxy/0.log" Apr 20 18:03:54.879313 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:54.879283 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-wcpf7_181ed309-e548-4308-9955-e76168b875c6/manager/0.log" Apr 20 18:03:56.830121 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:56.830034 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lcpnp_4dbb2025-5637-4338-971e-382de0d4f73b/node-exporter/0.log" Apr 20 18:03:56.860218 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:56.860175 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lcpnp_4dbb2025-5637-4338-971e-382de0d4f73b/kube-rbac-proxy/0.log" Apr 20 18:03:56.886269 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:56.886236 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lcpnp_4dbb2025-5637-4338-971e-382de0d4f73b/init-textfile/0.log" Apr 20 18:03:58.656827 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:58.656789 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-z92d2_f82115f5-7c80-4334-9e2c-bf493509b8ca/networking-console-plugin/0.log" Apr 20 18:03:59.182034 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:59.181997 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dqzrr/perf-node-gather-daemonset-fcrx2"] Apr 20 18:03:59.185491 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:59.185467 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fcrx2" Apr 20 18:03:59.188208 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:59.188188 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dqzrr\"/\"openshift-service-ca.crt\"" Apr 20 18:03:59.189348 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:59.189328 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-dqzrr\"/\"default-dockercfg-mm66t\"" Apr 20 18:03:59.189415 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:59.189368 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dqzrr\"/\"kube-root-ca.crt\"" Apr 20 18:03:59.192798 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:59.192377 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dqzrr/perf-node-gather-daemonset-fcrx2"] Apr 20 18:03:59.290470 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:59.290431 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ce2b1480-51cc-4a34-a732-f05f201e952e-podres\") pod \"perf-node-gather-daemonset-fcrx2\" (UID: \"ce2b1480-51cc-4a34-a732-f05f201e952e\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fcrx2" Apr 20 18:03:59.290470 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:59.290476 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce2b1480-51cc-4a34-a732-f05f201e952e-lib-modules\") pod \"perf-node-gather-daemonset-fcrx2\" (UID: \"ce2b1480-51cc-4a34-a732-f05f201e952e\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fcrx2" Apr 20 18:03:59.290728 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:59.290542 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ce2b1480-51cc-4a34-a732-f05f201e952e-proc\") pod \"perf-node-gather-daemonset-fcrx2\" (UID: \"ce2b1480-51cc-4a34-a732-f05f201e952e\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fcrx2" Apr 20 18:03:59.290728 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:59.290591 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jghr5\" (UniqueName: \"kubernetes.io/projected/ce2b1480-51cc-4a34-a732-f05f201e952e-kube-api-access-jghr5\") pod \"perf-node-gather-daemonset-fcrx2\" (UID: \"ce2b1480-51cc-4a34-a732-f05f201e952e\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fcrx2" Apr 20 18:03:59.290728 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:59.290614 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce2b1480-51cc-4a34-a732-f05f201e952e-sys\") pod \"perf-node-gather-daemonset-fcrx2\" (UID: \"ce2b1480-51cc-4a34-a732-f05f201e952e\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fcrx2" Apr 20 18:03:59.391154 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:59.391104 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ce2b1480-51cc-4a34-a732-f05f201e952e-podres\") pod \"perf-node-gather-daemonset-fcrx2\" (UID: \"ce2b1480-51cc-4a34-a732-f05f201e952e\") " 
pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fcrx2" Apr 20 18:03:59.391154 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:59.391157 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce2b1480-51cc-4a34-a732-f05f201e952e-lib-modules\") pod \"perf-node-gather-daemonset-fcrx2\" (UID: \"ce2b1480-51cc-4a34-a732-f05f201e952e\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fcrx2" Apr 20 18:03:59.391415 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:59.391196 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ce2b1480-51cc-4a34-a732-f05f201e952e-proc\") pod \"perf-node-gather-daemonset-fcrx2\" (UID: \"ce2b1480-51cc-4a34-a732-f05f201e952e\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fcrx2" Apr 20 18:03:59.391415 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:59.391230 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jghr5\" (UniqueName: \"kubernetes.io/projected/ce2b1480-51cc-4a34-a732-f05f201e952e-kube-api-access-jghr5\") pod \"perf-node-gather-daemonset-fcrx2\" (UID: \"ce2b1480-51cc-4a34-a732-f05f201e952e\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fcrx2" Apr 20 18:03:59.391415 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:59.391263 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce2b1480-51cc-4a34-a732-f05f201e952e-sys\") pod \"perf-node-gather-daemonset-fcrx2\" (UID: \"ce2b1480-51cc-4a34-a732-f05f201e952e\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fcrx2" Apr 20 18:03:59.391415 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:59.391285 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ce2b1480-51cc-4a34-a732-f05f201e952e-podres\") pod \"perf-node-gather-daemonset-fcrx2\" (UID: \"ce2b1480-51cc-4a34-a732-f05f201e952e\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fcrx2" Apr 20 18:03:59.391415 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:59.391311 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ce2b1480-51cc-4a34-a732-f05f201e952e-proc\") pod \"perf-node-gather-daemonset-fcrx2\" (UID: \"ce2b1480-51cc-4a34-a732-f05f201e952e\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fcrx2" Apr 20 18:03:59.391415 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:59.391338 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce2b1480-51cc-4a34-a732-f05f201e952e-sys\") pod \"perf-node-gather-daemonset-fcrx2\" (UID: \"ce2b1480-51cc-4a34-a732-f05f201e952e\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fcrx2" Apr 20 18:03:59.391415 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:59.391359 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce2b1480-51cc-4a34-a732-f05f201e952e-lib-modules\") pod \"perf-node-gather-daemonset-fcrx2\" (UID: \"ce2b1480-51cc-4a34-a732-f05f201e952e\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fcrx2" Apr 20 18:03:59.400190 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:59.400167 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jghr5\" (UniqueName: \"kubernetes.io/projected/ce2b1480-51cc-4a34-a732-f05f201e952e-kube-api-access-jghr5\") pod \"perf-node-gather-daemonset-fcrx2\" (UID: \"ce2b1480-51cc-4a34-a732-f05f201e952e\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fcrx2" Apr 20 18:03:59.497399 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:59.497307 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fcrx2" Apr 20 18:03:59.625519 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:59.625484 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dqzrr/perf-node-gather-daemonset-fcrx2"] Apr 20 18:03:59.629813 ip-10-0-138-9 kubenswrapper[2581]: W0420 18:03:59.629784 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podce2b1480_51cc_4a34_a732_f05f201e952e.slice/crio-7eb52ef1bdd0bce2dc2013ce5c4cfdbf3645445e493f355ace7fb7d213372485 WatchSource:0}: Error finding container 7eb52ef1bdd0bce2dc2013ce5c4cfdbf3645445e493f355ace7fb7d213372485: Status 404 returned error can't find the container with id 7eb52ef1bdd0bce2dc2013ce5c4cfdbf3645445e493f355ace7fb7d213372485 Apr 20 18:03:59.706734 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:59.706693 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fcrx2" event={"ID":"ce2b1480-51cc-4a34-a732-f05f201e952e","Type":"ContainerStarted","Data":"6b139571bc5faa2864b90b3641425e38ab254204c589d4b907a8b04ce1e7926d"} Apr 20 18:03:59.706734 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:59.706739 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fcrx2" event={"ID":"ce2b1480-51cc-4a34-a732-f05f201e952e","Type":"ContainerStarted","Data":"7eb52ef1bdd0bce2dc2013ce5c4cfdbf3645445e493f355ace7fb7d213372485"} Apr 20 18:03:59.707157 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:59.706846 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fcrx2" Apr 20 18:03:59.722662 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:03:59.722598 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fcrx2" podStartSLOduration=0.722583606 podStartE2EDuration="722.583606ms" podCreationTimestamp="2026-04-20 18:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 18:03:59.720080183 +0000 UTC m=+958.495610974" watchObservedRunningTime="2026-04-20 18:03:59.722583606 +0000 UTC m=+958.498114400" Apr 20 18:04:01.136370 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:01.136328 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-p6j2d_9ada05b5-e0d5-449c-83d5-41ed76dac3ee/dns/0.log" Apr 20 18:04:01.156962 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:01.156933 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-p6j2d_9ada05b5-e0d5-449c-83d5-41ed76dac3ee/kube-rbac-proxy/0.log" Apr 20 18:04:01.224467 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:01.224436 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-n9gmk_a1c5a7fc-92d7-49a4-85c8-128fe8e46b19/dns-node-resolver/0.log" Apr 20 18:04:01.718948 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:01.718918 
2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-g8b7h_1a64b0a7-defc-4ef8-b833-3e4b069784b3/node-ca/0.log" Apr 20 18:04:02.596521 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:02.596488 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-76zmb_56f9588d-d0d6-4172-9969-5f3f8ebaba4b/discovery/0.log" Apr 20 18:04:02.638795 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:02.638764 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-554dd5dd7d-xg5d4_d4cf93b3-09a3-4d10-997c-5b3c897b6be6/kube-auth-proxy/0.log" Apr 20 18:04:02.664864 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:02.664835 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-mwx4g_0a59f0c1-7444-4dd6-a891-f19c6ffef70c/istio-proxy/0.log" Apr 20 18:04:03.297567 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:03.297537 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-tqj4h_baa515c8-8724-40c1-a30b-d562783453d5/serve-healthcheck-canary/0.log" Apr 20 18:04:03.974804 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:03.974777 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-txgpz_7229afca-8988-40d4-85ba-ee0d637055e8/kube-rbac-proxy/0.log" Apr 20 18:04:03.994744 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:03.994714 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-txgpz_7229afca-8988-40d4-85ba-ee0d637055e8/exporter/0.log" Apr 20 18:04:04.015740 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:04.015715 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-txgpz_7229afca-8988-40d4-85ba-ee0d637055e8/extractor/0.log" Apr 20 18:04:05.720711 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:05.720681 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-fcrx2" Apr 20 18:04:05.849110 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:05.849076 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-vjnzh_3f6e3122-c4bd-400d-9578-aaf92e911ccf/manager/0.log" Apr 20 18:04:05.894672 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:05.894639 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-7f97df6549-xk7cs_6a347933-69be-4214-94a8-781636e2d17b/manager/0.log" Apr 20 18:04:05.914369 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:05.914342 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-lnffx_5c45cee7-7c81-4c6f-8550-4943f0ec1b33/manager/1.log" Apr 20 18:04:05.924445 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:05.924414 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-lnffx_5c45cee7-7c81-4c6f-8550-4943f0ec1b33/manager/2.log" Apr 20 18:04:05.949706 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:05.949682 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-b8c4c7886-4f5ss_883ebf5e-88bc-4638-8ef1-8d3bf51e49ad/manager/0.log" Apr 20 18:04:06.049066 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:06.048979 2581 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_postgres-868db5846d-z626m_51daf027-c1b9-465f-8f03-d53c5ba63343/postgres/0.log" Apr 20 18:04:07.206003 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:07.205971 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-fwcp5_39f7cf9c-c077-46d9-b01b-ef43218f5211/openshift-lws-operator/0.log" Apr 20 18:04:13.304194 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:13.304122 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9xc6w_6667bb3e-c213-4fb7-a2f0-bb9a65372bf3/kube-multus-additional-cni-plugins/0.log" Apr 20 18:04:13.326186 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:13.326158 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9xc6w_6667bb3e-c213-4fb7-a2f0-bb9a65372bf3/egress-router-binary-copy/0.log" Apr 20 18:04:13.349134 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:13.349110 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9xc6w_6667bb3e-c213-4fb7-a2f0-bb9a65372bf3/cni-plugins/0.log" Apr 20 18:04:13.373109 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:13.373082 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9xc6w_6667bb3e-c213-4fb7-a2f0-bb9a65372bf3/bond-cni-plugin/0.log" Apr 20 18:04:13.397352 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:13.397325 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9xc6w_6667bb3e-c213-4fb7-a2f0-bb9a65372bf3/routeoverride-cni/0.log" Apr 20 18:04:13.432690 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:13.432656 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9xc6w_6667bb3e-c213-4fb7-a2f0-bb9a65372bf3/whereabouts-cni-bincopy/0.log" Apr 20 18:04:13.457169 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:13.457142 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9xc6w_6667bb3e-c213-4fb7-a2f0-bb9a65372bf3/whereabouts-cni/0.log" Apr 20 18:04:13.671006 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:13.670979 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-v8gth_70ebccba-9caf-4e18-b7c2-430622fd3b07/kube-multus/0.log" Apr 20 18:04:13.694588 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:13.694561 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7gff7_c4102ca4-2dfd-487f-85a4-c91b3ae6797e/network-metrics-daemon/0.log" Apr 20 18:04:13.721338 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:13.721310 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7gff7_c4102ca4-2dfd-487f-85a4-c91b3ae6797e/kube-rbac-proxy/0.log" Apr 20 18:04:14.648104 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:14.648071 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lbls6_a41b8e52-ae34-439f-84de-ee703e85e441/ovn-controller/0.log" Apr 20 18:04:14.672004 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:14.671969 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lbls6_a41b8e52-ae34-439f-84de-ee703e85e441/ovn-acl-logging/0.log" Apr 20 18:04:14.693187 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:14.693155 2581 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lbls6_a41b8e52-ae34-439f-84de-ee703e85e441/kube-rbac-proxy-node/0.log" Apr 20 18:04:14.716308 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:14.716271 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lbls6_a41b8e52-ae34-439f-84de-ee703e85e441/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 18:04:14.734125 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:14.734101 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lbls6_a41b8e52-ae34-439f-84de-ee703e85e441/northd/0.log" Apr 20 18:04:14.754272 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:14.754241 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lbls6_a41b8e52-ae34-439f-84de-ee703e85e441/nbdb/0.log" Apr 20 18:04:14.774651 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:14.774605 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lbls6_a41b8e52-ae34-439f-84de-ee703e85e441/sbdb/0.log" Apr 20 18:04:14.869389 ip-10-0-138-9 kubenswrapper[2581]: I0420 18:04:14.869309 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lbls6_a41b8e52-ae34-439f-84de-ee703e85e441/ovnkube-controller/0.log"