Apr 21 02:40:32.192897 ip-10-0-137-147 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 02:40:32.799908 ip-10-0-137-147 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 02:40:32.799908 ip-10-0-137-147 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 02:40:32.799908 ip-10-0-137-147 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 02:40:32.799908 ip-10-0-137-147 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 02:40:32.799908 ip-10-0-137-147 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 02:40:32.799908 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.660148 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 02:40:32.799908 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667376 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 02:40:32.799908 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667389 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 02:40:32.799908 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667394 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 02:40:32.799908 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667397 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 02:40:32.799908 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667400 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 02:40:32.799908 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667409 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 02:40:32.799908 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667413 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 02:40:32.799908 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667417 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 02:40:32.775376 ip-10-0-137-147 systemd[1]: Started Kubernetes Kubelet.
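The deprecation warnings above all point at the kubelet config file mechanism; the FLAG dump later in this log shows --config="/etc/kubernetes/kubelet.conf". A minimal KubeletConfiguration sketch of how those deprecated command-line values could be expressed as config-file fields instead is shown below. The field names are standard kubelet.config.k8s.io/v1beta1 fields; the values are taken from the FLAG lines in this log, except the eviction threshold, which is an illustrative assumption.

apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint=/var/run/crio/crio.sock
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved=cpu=500m,ephemeral-storage=1Gi,memory=1Gi
systemReserved:
  cpu: 500m
  ephemeral-storage: 1Gi
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction settings;
# this threshold is an assumed example, not a value from this log
evictionHard:
  memory.available: "100Mi"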
Apr 21 02:40:32.802445 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667420 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 02:40:32.802445 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667422 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 02:40:32.802445 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667425 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 02:40:32.802445 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667428 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 02:40:32.802445 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667431 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 02:40:32.802445 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667433 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 02:40:32.802445 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667436 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 02:40:32.802445 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667438 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 02:40:32.802445 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667441 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 02:40:32.802445 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667444 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 02:40:32.802445 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667447 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 02:40:32.802445 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667449 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 02:40:32.802445 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667452 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 02:40:32.802445 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667455 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 02:40:32.802445 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667457 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 02:40:32.802445 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667460 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 02:40:32.802445 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667463 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 02:40:32.802445 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667466 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 02:40:32.802445 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667468 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 02:40:32.802445 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667471 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 02:40:32.803405 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667475 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 02:40:32.803405 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667479 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 02:40:32.803405 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667482 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 02:40:32.803405 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667485 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 02:40:32.803405 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667487 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 02:40:32.803405 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667490 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 02:40:32.803405 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667492 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 02:40:32.803405 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667495 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 02:40:32.803405 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667498 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 02:40:32.803405 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667500 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 02:40:32.803405 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667504 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 02:40:32.803405 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667506 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 02:40:32.803405 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667508 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 02:40:32.803405 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667511 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 02:40:32.803405 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667516 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 02:40:32.803405 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667530 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 02:40:32.803405 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667534 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 02:40:32.803405 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667537 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 02:40:32.804449 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667540 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 02:40:32.804449 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667543 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 02:40:32.804449 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667547 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 02:40:32.804449 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667550 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 02:40:32.804449 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667553 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 02:40:32.804449 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667556 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 02:40:32.804449 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667558 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 02:40:32.804449 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667561 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 02:40:32.804449 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667564 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 02:40:32.804449 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667566 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 02:40:32.804449 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667570 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 02:40:32.804449 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667572 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 02:40:32.804449 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667575 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 02:40:32.804449 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667579 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 02:40:32.804449 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667581 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 02:40:32.804449 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667584 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 02:40:32.804449 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667587 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 02:40:32.804449 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667589 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 02:40:32.804449 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667592 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 02:40:32.804449 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667594 2572 feature_gate.go:328] unrecognized feature 
gate: ShortCertRotation Apr 21 02:40:32.804449 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667597 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 02:40:32.805228 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667600 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 02:40:32.805228 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667602 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 02:40:32.805228 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667605 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 02:40:32.805228 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667608 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 02:40:32.805228 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667611 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 02:40:32.805228 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667614 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 02:40:32.805228 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667618 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 02:40:32.805228 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667620 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 21 02:40:32.805228 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667623 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 02:40:32.805228 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667625 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 02:40:32.805228 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667628 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 02:40:32.805228 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667630 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 02:40:32.805228 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667634 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 02:40:32.805228 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667637 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 02:40:32.805228 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667639 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 02:40:32.805228 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667642 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 02:40:32.805228 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667645 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 02:40:32.805228 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667648 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 02:40:32.805228 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.667650 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 02:40:32.805228 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669789 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 02:40:32.805942 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669807 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 02:40:32.805942 ip-10-0-137-147 kubenswrapper[2572]: W0421 
02:40:32.669811 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 02:40:32.805942 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669815 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 02:40:32.805942 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669821 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 02:40:32.805942 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669825 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 02:40:32.805942 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669828 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 02:40:32.805942 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669831 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 02:40:32.805942 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669834 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 02:40:32.805942 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669837 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 02:40:32.805942 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669840 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 02:40:32.805942 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669843 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 02:40:32.805942 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669846 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 02:40:32.805942 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669850 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 02:40:32.805942 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669853 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 02:40:32.805942 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669856 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 02:40:32.805942 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669859 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 02:40:32.805942 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669867 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 02:40:32.805942 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669872 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 02:40:32.805942 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669876 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 02:40:32.806867 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669880 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 02:40:32.806867 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669884 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 02:40:32.806867 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669889 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 02:40:32.806867 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669893 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 02:40:32.806867 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669896 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 02:40:32.806867 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669899 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 02:40:32.806867 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669902 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 02:40:32.806867 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669906 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 02:40:32.806867 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669911 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 02:40:32.806867 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669916 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 02:40:32.806867 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669921 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 02:40:32.806867 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669926 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 02:40:32.806867 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669942 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 02:40:32.806867 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669947 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 02:40:32.806867 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669952 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 02:40:32.806867 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669955 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 02:40:32.806867 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669958 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 02:40:32.806867 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669961 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 02:40:32.806867 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669964 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 02:40:32.806867 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669967 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 02:40:32.807937 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669972 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 02:40:32.807937 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669975 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 02:40:32.807937 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669978 2572 
feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 02:40:32.807937 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669981 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 02:40:32.807937 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669984 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 02:40:32.807937 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669987 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 02:40:32.807937 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669990 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 02:40:32.807937 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669992 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 02:40:32.807937 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669995 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 02:40:32.807937 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.669998 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 02:40:32.807937 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670001 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 02:40:32.807937 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670004 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 02:40:32.807937 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670006 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 02:40:32.807937 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670011 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 21 02:40:32.807937 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670014 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 02:40:32.807937 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670017 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 02:40:32.807937 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670020 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 02:40:32.807937 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670022 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 02:40:32.807937 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670025 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 02:40:32.807937 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670028 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 02:40:32.808903 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670031 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 02:40:32.808903 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670034 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 02:40:32.808903 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670038 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 02:40:32.808903 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670041 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 02:40:32.808903 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670044 2572 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesAzure Apr 21 02:40:32.808903 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670046 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 02:40:32.808903 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670052 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 02:40:32.808903 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670055 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 02:40:32.808903 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670058 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 02:40:32.808903 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670060 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 02:40:32.808903 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670063 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 02:40:32.808903 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670066 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 02:40:32.808903 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670068 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 02:40:32.808903 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670071 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 02:40:32.808903 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670076 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 02:40:32.808903 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670079 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 02:40:32.808903 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670084 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 02:40:32.808903 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670090 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 02:40:32.808903 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670094 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 02:40:32.808903 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670098 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 02:40:32.809841 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670101 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 02:40:32.809841 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670105 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 02:40:32.809841 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670108 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 02:40:32.809841 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670111 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 02:40:32.809841 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670115 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 02:40:32.809841 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.670118 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 02:40:32.809841 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670257 2572 flags.go:64] FLAG: --address="0.0.0.0" Apr 21 02:40:32.809841 ip-10-0-137-147 kubenswrapper[2572]: I0421 
02:40:32.670266 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 21 02:40:32.809841 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670277 2572 flags.go:64] FLAG: --anonymous-auth="true" Apr 21 02:40:32.809841 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670282 2572 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 21 02:40:32.809841 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670287 2572 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 21 02:40:32.809841 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670293 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 21 02:40:32.809841 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670300 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 21 02:40:32.809841 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670308 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 21 02:40:32.809841 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670315 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 21 02:40:32.809841 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670328 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 21 02:40:32.809841 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670332 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 21 02:40:32.809841 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670336 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 21 02:40:32.809841 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670339 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 21 02:40:32.809841 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670345 2572 flags.go:64] FLAG: --cgroup-root="" Apr 21 02:40:32.809841 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670350 2572 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 21 02:40:32.809841 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670852 2572 flags.go:64] FLAG: --client-ca-file="" Apr 21 02:40:32.810494 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670871 2572 flags.go:64] FLAG: --cloud-config="" Apr 21 02:40:32.810494 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670875 2572 flags.go:64] FLAG: --cloud-provider="external" Apr 21 02:40:32.810494 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670879 2572 flags.go:64] FLAG: --cluster-dns="[]" Apr 21 02:40:32.810494 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670886 2572 flags.go:64] FLAG: --cluster-domain="" Apr 21 02:40:32.810494 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670889 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 21 02:40:32.810494 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670893 2572 flags.go:64] FLAG: --config-dir="" Apr 21 02:40:32.810494 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670896 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 21 02:40:32.810494 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670900 2572 flags.go:64] FLAG: --container-log-max-files="5" Apr 21 02:40:32.810494 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670906 2572 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 21 02:40:32.810494 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670909 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 21 02:40:32.810494 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670915 2572 flags.go:64] FLAG: 
--containerd="/run/containerd/containerd.sock" Apr 21 02:40:32.810494 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670919 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 21 02:40:32.810494 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670922 2572 flags.go:64] FLAG: --contention-profiling="false" Apr 21 02:40:32.810494 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670925 2572 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 21 02:40:32.810494 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670928 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 21 02:40:32.810494 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670932 2572 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 21 02:40:32.810494 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670936 2572 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 21 02:40:32.810494 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670941 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 21 02:40:32.810494 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670944 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 21 02:40:32.810494 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670947 2572 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 21 02:40:32.810494 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670950 2572 flags.go:64] FLAG: --enable-load-reader="false" Apr 21 02:40:32.810494 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670953 2572 flags.go:64] FLAG: --enable-server="true" Apr 21 02:40:32.810494 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670956 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 21 02:40:32.810494 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670962 2572 flags.go:64] FLAG: --event-burst="100" Apr 21 02:40:32.810494 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670965 2572 flags.go:64] FLAG: --event-qps="50" Apr 21 02:40:32.811681 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670969 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 21 02:40:32.811681 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670972 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 21 02:40:32.811681 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670976 2572 flags.go:64] FLAG: --eviction-hard="" Apr 21 02:40:32.811681 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670980 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 21 02:40:32.811681 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670984 2572 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 21 02:40:32.811681 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670987 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 21 02:40:32.811681 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670990 2572 flags.go:64] FLAG: --eviction-soft="" Apr 21 02:40:32.811681 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670993 2572 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 21 02:40:32.811681 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670996 2572 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 21 02:40:32.811681 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.670999 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 21 02:40:32.811681 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671002 2572 flags.go:64] FLAG: --experimental-mounter-path="" Apr 21 02:40:32.811681 ip-10-0-137-147 kubenswrapper[2572]: I0421 
02:40:32.671005 2572 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 21 02:40:32.811681 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671009 2572 flags.go:64] FLAG: --fail-swap-on="true" Apr 21 02:40:32.811681 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671012 2572 flags.go:64] FLAG: --feature-gates="" Apr 21 02:40:32.811681 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671016 2572 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 02:40:32.811681 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671019 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 02:40:32.811681 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671022 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 02:40:32.811681 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671026 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 02:40:32.811681 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671030 2572 flags.go:64] FLAG: --healthz-port="10248" Apr 21 02:40:32.811681 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671034 2572 flags.go:64] FLAG: --help="false" Apr 21 02:40:32.811681 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671037 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-137-147.ec2.internal" Apr 21 02:40:32.811681 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671040 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 02:40:32.811681 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671044 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 02:40:32.811681 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671047 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 02:40:32.812999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671051 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 02:40:32.812999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671054 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 02:40:32.812999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671058 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 02:40:32.812999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671061 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 02:40:32.812999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671064 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 02:40:32.812999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671067 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 02:40:32.812999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671070 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 02:40:32.812999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671074 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 02:40:32.812999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671077 2572 flags.go:64] FLAG: --kube-reserved="" Apr 21 02:40:32.812999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671080 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 02:40:32.812999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671083 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 02:40:32.812999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671086 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 02:40:32.812999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671089 
2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 02:40:32.812999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671092 2572 flags.go:64] FLAG: --lock-file="" Apr 21 02:40:32.812999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671094 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 02:40:32.812999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671097 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 02:40:32.812999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671100 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 02:40:32.812999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671106 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 02:40:32.812999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671109 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 02:40:32.812999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671112 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 02:40:32.812999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671115 2572 flags.go:64] FLAG: --logging-format="text" Apr 21 02:40:32.812999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671118 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 02:40:32.812999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671122 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 02:40:32.812999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671125 2572 flags.go:64] FLAG: --manifest-url="" Apr 21 02:40:32.813773 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671127 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 21 02:40:32.813773 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671133 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 02:40:32.813773 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671137 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 02:40:32.813773 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671141 2572 flags.go:64] FLAG: --max-pods="110" Apr 21 02:40:32.813773 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671145 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 02:40:32.813773 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671148 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 02:40:32.813773 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671151 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 02:40:32.813773 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671154 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 02:40:32.813773 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671157 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 02:40:32.813773 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671160 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 02:40:32.813773 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671163 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 02:40:32.813773 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671171 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 02:40:32.813773 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671174 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 02:40:32.813773 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671182 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 02:40:32.813773 
ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671185 2572 flags.go:64] FLAG: --pod-cidr="" Apr 21 02:40:32.813773 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671188 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 02:40:32.813773 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671193 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 02:40:32.813773 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671196 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 02:40:32.813773 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671199 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 21 02:40:32.813773 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671202 2572 flags.go:64] FLAG: --port="10250" Apr 21 02:40:32.813773 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671206 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 02:40:32.813773 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671209 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0dc9069eb5880b5a2" Apr 21 02:40:32.813773 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671212 2572 flags.go:64] FLAG: --qos-reserved="" Apr 21 02:40:32.813773 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671215 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 21 02:40:32.815174 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671218 2572 flags.go:64] FLAG: --register-node="true" Apr 21 02:40:32.815174 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671221 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 21 02:40:32.815174 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671224 2572 flags.go:64] FLAG: --register-with-taints="" Apr 21 02:40:32.815174 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671233 2572 flags.go:64] FLAG: --registry-burst="10" Apr 21 02:40:32.815174 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671236 2572 flags.go:64] FLAG: --registry-qps="5" Apr 21 02:40:32.815174 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671239 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 21 02:40:32.815174 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671243 2572 flags.go:64] FLAG: --reserved-memory="" Apr 21 02:40:32.815174 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671246 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 02:40:32.815174 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671249 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 02:40:32.815174 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671252 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 02:40:32.815174 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671258 2572 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 02:40:32.815174 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671262 2572 flags.go:64] FLAG: --runonce="false" Apr 21 02:40:32.815174 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671265 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 02:40:32.815174 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671268 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 02:40:32.815174 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671271 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 21 02:40:32.815174 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671275 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 
21 02:40:32.815174 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671278 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 02:40:32.815174 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671281 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 02:40:32.815174 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671284 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 02:40:32.815174 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671287 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 02:40:32.815174 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671290 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 02:40:32.815174 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671294 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 02:40:32.815174 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671297 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 02:40:32.815174 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671300 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 02:40:32.815174 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671303 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 02:40:32.815174 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671306 2572 flags.go:64] FLAG: --system-cgroups="" Apr 21 02:40:32.815983 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671309 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 02:40:32.815983 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671315 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 02:40:32.815983 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671318 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 21 02:40:32.815983 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671321 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 02:40:32.815983 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671325 2572 flags.go:64] FLAG: --tls-min-version="" Apr 21 02:40:32.815983 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671328 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 02:40:32.815983 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671331 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 02:40:32.815983 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671334 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 02:40:32.815983 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671337 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 02:40:32.815983 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671340 2572 flags.go:64] FLAG: --v="2" Apr 21 02:40:32.815983 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671345 2572 flags.go:64] FLAG: --version="false" Apr 21 02:40:32.815983 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671349 2572 flags.go:64] FLAG: --vmodule="" Apr 21 02:40:32.815983 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671354 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 02:40:32.815983 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.671357 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 02:40:32.815983 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671459 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 02:40:32.815983 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671464 2572 
feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 02:40:32.815983 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671468 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 02:40:32.815983 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671472 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 02:40:32.815983 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671477 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 02:40:32.815983 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671480 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 02:40:32.815983 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671483 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 02:40:32.815983 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671485 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 02:40:32.815983 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671488 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 02:40:32.816759 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671490 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 02:40:32.816759 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671493 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 02:40:32.816759 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671496 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 02:40:32.816759 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671498 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 02:40:32.816759 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671502 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 02:40:32.816759 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671505 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 02:40:32.816759 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671507 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 02:40:32.816759 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671510 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 02:40:32.816759 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671513 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 02:40:32.816759 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671515 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 02:40:32.816759 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671534 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 02:40:32.816759 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671537 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 02:40:32.816759 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671540 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 02:40:32.816759 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671542 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 02:40:32.816759 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671545 2572 feature_gate.go:328] unrecognized feature gate: 
DyanmicServiceEndpointIBMCloud Apr 21 02:40:32.816759 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671548 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 02:40:32.816759 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671550 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 02:40:32.816759 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671553 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 02:40:32.816759 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671556 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 02:40:32.816759 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671560 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 21 02:40:32.817381 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671562 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 02:40:32.817381 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671565 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 02:40:32.817381 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671567 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 02:40:32.817381 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671570 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 02:40:32.817381 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671573 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 02:40:32.817381 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671577 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 02:40:32.817381 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671580 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 02:40:32.817381 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671583 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 02:40:32.817381 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671586 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 02:40:32.817381 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671588 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 02:40:32.817381 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671591 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 02:40:32.817381 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671594 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 02:40:32.817381 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671596 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 02:40:32.817381 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671599 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 02:40:32.817381 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671602 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 02:40:32.817381 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671604 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 02:40:32.817381 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671608 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 02:40:32.817381 ip-10-0-137-147 
kubenswrapper[2572]: W0421 02:40:32.671611 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 02:40:32.817381 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671614 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 02:40:32.818068 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671616 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 02:40:32.818068 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671619 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 02:40:32.818068 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671621 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 02:40:32.818068 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671624 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 02:40:32.818068 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671626 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 02:40:32.818068 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671629 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 02:40:32.818068 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671632 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 02:40:32.818068 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671634 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 02:40:32.818068 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671637 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 02:40:32.818068 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671639 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 02:40:32.818068 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671641 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 02:40:32.818068 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671644 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 02:40:32.818068 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671647 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 02:40:32.818068 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671649 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 02:40:32.818068 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671652 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 02:40:32.818068 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671654 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 02:40:32.818068 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671657 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 02:40:32.818068 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671661 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 02:40:32.818068 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671665 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 02:40:32.818068 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671668 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 02:40:32.818690 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671670 2572 
feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 02:40:32.818690 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671673 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 02:40:32.818690 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671676 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 02:40:32.818690 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671679 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 02:40:32.818690 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671681 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 02:40:32.818690 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671684 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 02:40:32.818690 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671686 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 02:40:32.818690 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671689 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 02:40:32.818690 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671691 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 02:40:32.818690 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671696 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 02:40:32.818690 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671699 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 02:40:32.818690 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671701 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 02:40:32.818690 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671704 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 02:40:32.818690 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671706 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 02:40:32.818690 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671709 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 02:40:32.818690 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671713 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 02:40:32.818690 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671717 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 02:40:32.818690 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.671721 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
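Three distinct warning sites appear in the entries above: feature_gate.go:328 flags gate names this kubelet binary does not recognize (cluster-level OpenShift gates passed down in the node config), feature_gate.go:351 notes a gate that has already graduated to GA, and feature_gate.go:349 notes a deprecated gate still being set. For a de-duplicated view of which gates a node is warning about, a short script over a saved journal excerpt is enough; the sketch below assumes the kubelet journal has been saved to a file named kubelet.log (the filename is illustrative, not taken from the log).

    import re
    from collections import Counter

    # Count each unrecognized feature gate in a saved kubelet journal excerpt.
    # "kubelet.log" is an assumed filename for the captured journal output.
    pattern = re.compile(r"feature_gate\.go:328\] unrecognized feature gate: (\S+)")

    with open("kubelet.log") as fh:
        gates = Counter(pattern.findall(fh.read()))

    for name, count in sorted(gates.items()):
        print(f"{name}: seen {count} time(s)")
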
Apr 21 02:40:32.819244 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.672498 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 02:40:32.819244 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.679165 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 21 02:40:32.819244 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.679183 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 21 02:40:32.819244 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679229 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 02:40:32.819244 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679235 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 02:40:32.819244 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679238 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 02:40:32.819244 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679241 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 02:40:32.819244 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679244 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 02:40:32.819244 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679247 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 02:40:32.819244 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679250 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 02:40:32.819244 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679253 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 02:40:32.819244 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679255 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 02:40:32.819244 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679258 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 02:40:32.819244 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679261 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 02:40:32.819244 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679263 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 02:40:32.819244 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679266 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 02:40:32.819790 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679269 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 02:40:32.819790 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679272 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 02:40:32.819790 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679276 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
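The feature_gate.go:384 entry above prints the effective kubelet gate settings in Go's map[...] notation (DynamicResourceAllocation:false, ImageVolume:true, KMSv1:true, and so on). To work with that as structured data rather than a raw string, a couple of lines of parsing suffice; the sketch below uses an abbreviated copy of the logged entry and only handles the simple Name:bool form shown here.

    # Minimal sketch: turn the logged "feature gates: {map[...]}" text into a dict.
    # The input string is an abbreviated copy of the journal entry above.
    logged = ("feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false "
              "ImageVolume:true KMSv1:true ServiceAccountTokenNodeBinding:true]}")

    inner = logged.split("{map[", 1)[1].rstrip("]}")
    gates = {k: v == "true" for k, v in (pair.split(":") for pair in inner.split())}
    print(gates)  # {'DynamicResourceAllocation': False, ..., 'KMSv1': True}
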
Apr 21 02:40:32.819790 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679280 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 02:40:32.819790 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679283 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 02:40:32.819790 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679286 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 02:40:32.819790 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679289 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 02:40:32.819790 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679291 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 02:40:32.819790 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679294 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 02:40:32.819790 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679297 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 02:40:32.819790 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679300 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 02:40:32.819790 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679305 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 02:40:32.819790 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679308 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 02:40:32.819790 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679311 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 02:40:32.819790 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679315 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 02:40:32.819790 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679318 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 02:40:32.819790 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679321 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 02:40:32.819790 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679324 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 02:40:32.819790 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679328 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 02:40:32.820383 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679331 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 02:40:32.820383 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679333 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 02:40:32.820383 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679336 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 02:40:32.820383 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679338 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 02:40:32.820383 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679341 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 02:40:32.820383 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679344 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 02:40:32.820383 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679346 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 02:40:32.820383 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679349 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 02:40:32.820383 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679351 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 02:40:32.820383 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679354 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 02:40:32.820383 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679356 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 02:40:32.820383 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679359 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 02:40:32.820383 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679362 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 02:40:32.820383 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679364 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 02:40:32.820383 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679367 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 02:40:32.820383 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679370 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 02:40:32.820383 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679373 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 02:40:32.820383 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679376 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 02:40:32.820383 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679378 
2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 02:40:32.820383 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679381 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 02:40:32.821265 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679383 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 02:40:32.821265 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679386 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 02:40:32.821265 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679388 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 02:40:32.821265 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679391 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 02:40:32.821265 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679395 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 02:40:32.821265 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679397 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 02:40:32.821265 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679400 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 02:40:32.821265 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679402 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 21 02:40:32.821265 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679405 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 02:40:32.821265 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679407 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 02:40:32.821265 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679410 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 02:40:32.821265 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679412 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 02:40:32.821265 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679416 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 02:40:32.821265 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679418 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 02:40:32.821265 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679421 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 02:40:32.821265 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679423 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 02:40:32.821265 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679425 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 02:40:32.821265 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679428 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 02:40:32.821265 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679431 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 02:40:32.821265 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679433 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 02:40:32.822096 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679435 2572 feature_gate.go:328] unrecognized feature gate: 
AWSClusterHostedDNSInstall Apr 21 02:40:32.822096 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679438 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 02:40:32.822096 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679441 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 02:40:32.822096 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679444 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 02:40:32.822096 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679447 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 02:40:32.822096 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679449 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 02:40:32.822096 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679452 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 02:40:32.822096 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679454 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 02:40:32.822096 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679457 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 02:40:32.822096 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679460 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 02:40:32.822096 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679462 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 02:40:32.822096 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679465 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 02:40:32.822096 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679468 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 02:40:32.822096 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679470 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 02:40:32.822096 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.679475 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 02:40:32.822817 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679590 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 02:40:32.822817 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679597 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 02:40:32.822817 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679600 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 02:40:32.822817 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679604 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 02:40:32.822817 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679607 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 02:40:32.822817 ip-10-0-137-147 
kubenswrapper[2572]: W0421 02:40:32.679609 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 02:40:32.822817 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679612 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 02:40:32.822817 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679616 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 02:40:32.822817 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679618 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 02:40:32.822817 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679621 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 02:40:32.822817 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679625 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 02:40:32.822817 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679627 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 02:40:32.822817 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679630 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 02:40:32.822817 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679633 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 02:40:32.822817 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679635 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 02:40:32.822817 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679641 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 02:40:32.822817 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679643 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 02:40:32.822817 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679646 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 02:40:32.822817 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679649 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 02:40:32.822817 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679651 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 02:40:32.823646 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679654 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 02:40:32.823646 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679657 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 02:40:32.823646 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679660 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 02:40:32.823646 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679664 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 02:40:32.823646 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679667 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 02:40:32.823646 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679670 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 02:40:32.823646 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679672 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 02:40:32.823646 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679675 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 02:40:32.823646 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679678 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 02:40:32.823646 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679681 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 02:40:32.823646 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679683 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 02:40:32.823646 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679686 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 02:40:32.823646 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679688 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 02:40:32.823646 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679693 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 02:40:32.823646 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679695 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 02:40:32.823646 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679698 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 02:40:32.823646 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679701 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 02:40:32.823646 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679703 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 02:40:32.823646 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679706 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 21 02:40:32.824286 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679709 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 02:40:32.824286 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679711 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 02:40:32.824286 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679714 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 02:40:32.824286 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679716 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 02:40:32.824286 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679719 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 02:40:32.824286 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679722 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 02:40:32.824286 ip-10-0-137-147 
kubenswrapper[2572]: W0421 02:40:32.679725 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 02:40:32.824286 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679727 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 02:40:32.824286 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679731 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 02:40:32.824286 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679734 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 02:40:32.824286 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679736 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 02:40:32.824286 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679739 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 02:40:32.824286 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679742 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 02:40:32.824286 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679744 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 02:40:32.824286 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679747 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 02:40:32.824286 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679749 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 02:40:32.824286 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679753 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 02:40:32.824286 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679757 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 02:40:32.824286 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679760 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 02:40:32.824956 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679762 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 02:40:32.824956 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679765 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 02:40:32.824956 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679768 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 02:40:32.824956 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679770 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 02:40:32.824956 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679773 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 02:40:32.824956 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679775 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 02:40:32.824956 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679778 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 02:40:32.824956 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679781 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 02:40:32.824956 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679784 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 02:40:32.824956 
ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679786 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 02:40:32.824956 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679789 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 02:40:32.824956 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679791 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 02:40:32.824956 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679794 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 02:40:32.824956 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679796 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 02:40:32.824956 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679799 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 02:40:32.824956 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679801 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 02:40:32.824956 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679804 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 02:40:32.824956 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679806 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 02:40:32.824956 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679809 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 02:40:32.824956 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679812 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 02:40:32.825714 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679815 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 02:40:32.825714 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679818 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 02:40:32.825714 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679821 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 02:40:32.825714 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679824 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 02:40:32.825714 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679826 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 02:40:32.825714 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679829 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 02:40:32.825714 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679831 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 02:40:32.825714 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:32.679834 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 02:40:32.825714 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.679839 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 02:40:32.825714 
ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.679950 2572 server.go:962] "Client rotation is on, will bootstrap in background" Apr 21 02:40:32.825714 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.683363 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 21 02:40:32.825714 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.684381 2572 server.go:1019] "Starting client certificate rotation" Apr 21 02:40:32.825714 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.684469 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 02:40:32.825714 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.684513 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 02:40:32.826313 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.713470 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 02:40:32.826313 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.716122 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 02:40:32.826313 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.732131 2572 log.go:25] "Validated CRI v1 runtime API" Apr 21 02:40:32.826313 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.740144 2572 log.go:25] "Validated CRI v1 image API" Apr 21 02:40:32.826313 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.741607 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 21 02:40:32.826313 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.745644 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 02:40:32.826313 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.746826 2572 fs.go:135] Filesystem UUIDs: map[52fa4a3e-7f00-418f-8980-e7f734240394:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 7f6b94a7-f614-4223-8e09-25498b7a4cc8:/dev/nvme0n1p4] Apr 21 02:40:32.826313 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.746841 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 21 02:40:33.080032 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.751481 2572 manager.go:217] Machine: {Timestamp:2026-04-21 02:40:32.750181942 +0000 UTC m=+0.434726196 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101247 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec285f34ab9978c9e692c1acd1a47c70 SystemUUID:ec285f34-ab99-78c9-e692-c1acd1a47c70 BootID:17d46329-f8bd-43a3-8fdb-85770ef57768 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} 
{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:8d:3e:c7:95:e9 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:8d:3e:c7:95:e9 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:66:ca:5f:40:f9:19 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 21 02:40:33.080032 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.751600 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
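The Machine entry above reports MemoryCapacity:33164484608 (roughly 31 GiB), and the Container Manager node config logged just below shows SystemReserved memory of 1Gi, KubeReserved null, and a hard eviction threshold of 100Mi for memory.available. Node allocatable memory is capacity minus those reservations and the hard eviction threshold, so the arithmetic for this node works out as in the back-of-the-envelope check below (a sketch using the logged values, not output from the log itself).

    # Back-of-the-envelope allocatable-memory check using values from the log:
    # capacity from the cAdvisor Machine entry, reservations from the
    # Container Manager node config that follows.
    GI = 1024 ** 3
    MI = 1024 ** 2

    capacity = 33164484608          # MemoryCapacity from the Machine entry
    system_reserved = 1 * GI        # SystemReserved memory: "1Gi"
    kube_reserved = 0               # KubeReserved is null in the node config
    eviction_hard = 100 * MI        # memory.available hard eviction threshold: "100Mi"

    allocatable = capacity - system_reserved - kube_reserved - eviction_hard
    print(f"allocatable memory ~= {allocatable} bytes ({allocatable / GI:.2f} GiB)")
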
Apr 21 02:40:33.080032 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.751683 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 21 02:40:33.080032 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.754899 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 21 02:40:33.080032 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.754925 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-147.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 21 02:40:33.080032 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.755071 2572 topology_manager.go:138] "Creating topology manager with none policy" Apr 21 02:40:33.080032 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.755080 2572 container_manager_linux.go:306] "Creating device plugin manager" Apr 21 02:40:33.080032 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.755093 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 02:40:33.080032 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.755955 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 02:40:33.080032 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.758216 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 21 02:40:33.080032 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.758331 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 02:40:33.080032 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.761217 2572 kubelet.go:491] "Attempting to sync node with API server" Apr 21 02:40:33.080032 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.761230 2572 kubelet.go:386] "Adding static pod 
path" path="/etc/kubernetes/manifests" Apr 21 02:40:33.080032 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.761245 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 02:40:33.080032 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.761255 2572 kubelet.go:397] "Adding apiserver pod source" Apr 21 02:40:33.080032 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.761265 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 21 02:40:33.080032 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.762426 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 02:40:33.080032 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.762440 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.765806 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.768034 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.769504 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.769533 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.769541 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.769546 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.769552 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.769559 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.769564 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.769569 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.769576 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.769582 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.769590 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.769599 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.770561 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.770568 2572 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/image" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.774148 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-147.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:32.774160 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:32.774212 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-147.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.774439 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.774470 2572 server.go:1295] "Started kubelet" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.774550 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.775109 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.777099 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.777939 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.779225 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.785929 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qlxds" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:32.785655 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-147.ec2.internal.18a83eeb2c055c85 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-147.ec2.internal,UID:ip-10-0-137-147.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-147.ec2.internal,},FirstTimestamp:2026-04-21 02:40:32.774446213 +0000 UTC m=+0.458990464,LastTimestamp:2026-04-21 02:40:32.774446213 +0000 UTC m=+0.458990464,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-147.ec2.internal,}" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 
02:40:32.787669 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.788402 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:32.788467 2572 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.789148 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.789162 2572 factory.go:55] Registering systemd factory Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.789171 2572 factory.go:223] Registration of the systemd container factory successfully Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.789243 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.789244 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.789261 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.789333 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.789341 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.789357 2572 factory.go:153] Registering CRI-O factory Apr 21 02:40:33.081319 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.789370 2572 factory.go:223] Registration of the crio container factory successfully Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.789400 2572 factory.go:103] Registering Raw factory Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.789413 2572 manager.go:1196] Started watching for new ooms in manager Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:32.789547 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-147.ec2.internal\" not found" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.790175 2572 manager.go:319] Starting recovery of all containers Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.793000 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qlxds" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.800166 2572 manager.go:324] Recovery completed Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.801170 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:32.803682 2572 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-137-147.ec2.internal\" 
not found" node="ip-10-0-137-147.ec2.internal" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.806055 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.811683 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-147.ec2.internal" event="NodeHasSufficientMemory" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.811725 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-147.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.811738 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-147.ec2.internal" event="NodeHasSufficientPID" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.812415 2572 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.812428 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.812459 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.818283 2572 policy_none.go:49] "None policy: Start" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.818313 2572 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.818328 2572 state_mem.go:35] "Initializing new in-memory state store" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.867133 2572 manager.go:341] "Starting Device Plugin manager" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:32.867167 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.867180 2572 server.go:85] "Starting device plugin registration server" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.867465 2572 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.867480 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.867578 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.867670 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.867679 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:32.868231 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:32.868272 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-147.ec2.internal\" not found" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.902799 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.904037 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.904064 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.904086 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.904092 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:32.904128 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.907447 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.968510 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.969458 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-147.ec2.internal" event="NodeHasSufficientMemory" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.969487 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-147.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.969499 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-147.ec2.internal" event="NodeHasSufficientPID" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.969553 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-147.ec2.internal" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:32.979296 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-147.ec2.internal" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:32.979318 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-147.ec2.internal\": node \"ip-10-0-137-147.ec2.internal\" not found" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:32.992695 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-147.ec2.internal\" not found" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.004698 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-147.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-147.ec2.internal"] Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.004780 2572 
kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.008274 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-147.ec2.internal" event="NodeHasSufficientMemory" Apr 21 02:40:33.082726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.008299 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-147.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 02:40:33.084072 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.008310 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-147.ec2.internal" event="NodeHasSufficientPID" Apr 21 02:40:33.084072 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.009866 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 02:40:33.084072 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.010011 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-147.ec2.internal" Apr 21 02:40:33.084072 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.010042 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 02:40:33.084072 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.011079 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-147.ec2.internal" event="NodeHasSufficientMemory" Apr 21 02:40:33.084072 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.011101 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-147.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 02:40:33.084072 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.011111 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-147.ec2.internal" event="NodeHasSufficientPID" Apr 21 02:40:33.084072 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.011081 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-147.ec2.internal" event="NodeHasSufficientMemory" Apr 21 02:40:33.084072 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.011174 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-147.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 02:40:33.084072 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.011186 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-147.ec2.internal" event="NodeHasSufficientPID" Apr 21 02:40:33.084072 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.012256 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-147.ec2.internal" Apr 21 02:40:33.084072 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.012280 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 02:40:33.084072 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.013033 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-147.ec2.internal" event="NodeHasSufficientMemory" Apr 21 02:40:33.084072 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.013054 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-147.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 02:40:33.084072 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.013064 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-147.ec2.internal" event="NodeHasSufficientPID" Apr 21 02:40:33.084072 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:33.035091 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-147.ec2.internal\" not found" node="ip-10-0-137-147.ec2.internal" Apr 21 02:40:33.084072 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:33.038517 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-147.ec2.internal\" not found" node="ip-10-0-137-147.ec2.internal" Apr 21 02:40:33.391948 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.091681 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b571a1977453700e90c1fa2adc7c4324-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-147.ec2.internal\" (UID: \"b571a1977453700e90c1fa2adc7c4324\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-147.ec2.internal" Apr 21 02:40:33.391948 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.091709 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/be812a68c721006fc84b840bb8d76277-config\") pod \"kube-apiserver-proxy-ip-10-0-137-147.ec2.internal\" (UID: \"be812a68c721006fc84b840bb8d76277\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-147.ec2.internal" Apr 21 02:40:33.391948 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.091730 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b571a1977453700e90c1fa2adc7c4324-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-147.ec2.internal\" (UID: \"b571a1977453700e90c1fa2adc7c4324\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-147.ec2.internal" Apr 21 02:40:33.391948 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:33.093739 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-147.ec2.internal\" not found" Apr 21 02:40:33.391948 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.192256 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b571a1977453700e90c1fa2adc7c4324-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-147.ec2.internal\" (UID: \"b571a1977453700e90c1fa2adc7c4324\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-147.ec2.internal" Apr 
21 02:40:33.391948 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.192289 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b571a1977453700e90c1fa2adc7c4324-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-147.ec2.internal\" (UID: \"b571a1977453700e90c1fa2adc7c4324\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-147.ec2.internal" Apr 21 02:40:33.391948 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.192307 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/be812a68c721006fc84b840bb8d76277-config\") pod \"kube-apiserver-proxy-ip-10-0-137-147.ec2.internal\" (UID: \"be812a68c721006fc84b840bb8d76277\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-147.ec2.internal" Apr 21 02:40:33.391948 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.192372 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b571a1977453700e90c1fa2adc7c4324-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-147.ec2.internal\" (UID: \"b571a1977453700e90c1fa2adc7c4324\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-147.ec2.internal" Apr 21 02:40:33.391948 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.192377 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b571a1977453700e90c1fa2adc7c4324-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-147.ec2.internal\" (UID: \"b571a1977453700e90c1fa2adc7c4324\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-147.ec2.internal" Apr 21 02:40:33.391948 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.192386 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/be812a68c721006fc84b840bb8d76277-config\") pod \"kube-apiserver-proxy-ip-10-0-137-147.ec2.internal\" (UID: \"be812a68c721006fc84b840bb8d76277\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-147.ec2.internal" Apr 21 02:40:33.391948 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:33.194338 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-147.ec2.internal\" not found" Apr 21 02:40:33.391948 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:33.295281 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-147.ec2.internal\" not found" Apr 21 02:40:33.391948 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.337459 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-147.ec2.internal" Apr 21 02:40:33.391948 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.340969 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-147.ec2.internal" Apr 21 02:40:33.648234 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:33.395386 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-147.ec2.internal\" not found" Apr 21 02:40:33.648234 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:33.495785 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-147.ec2.internal\" not found" Apr 21 02:40:33.648234 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:33.596358 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-147.ec2.internal\" not found" Apr 21 02:40:33.803200 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.684575 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 21 02:40:33.803200 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.684727 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 02:40:33.803200 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.684745 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 02:40:33.803200 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:33.696920 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-147.ec2.internal\" not found" Apr 21 02:40:33.803200 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.788393 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 21 02:40:33.803200 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.794993 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 02:35:32 +0000 UTC" deadline="2027-11-04 00:54:05.217229638 +0000 UTC" Apr 21 02:40:33.803200 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.795018 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13486h13m31.422214693s" Apr 21 02:40:33.803200 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:33.797469 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-147.ec2.internal\" not found" Apr 21 02:40:33.807996 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.807537 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 02:40:34.046400 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.827638 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-pjmgg" Apr 21 02:40:34.046400 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.835871 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-pjmgg" Apr 21 02:40:34.046400 ip-10-0-137-147 kubenswrapper[2572]: W0421 
02:40:33.889185 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb571a1977453700e90c1fa2adc7c4324.slice/crio-2290611c2cbb68a4eda9b4e64356177207827f07e3bad00702c4da83a333a773 WatchSource:0}: Error finding container 2290611c2cbb68a4eda9b4e64356177207827f07e3bad00702c4da83a333a773: Status 404 returned error can't find the container with id 2290611c2cbb68a4eda9b4e64356177207827f07e3bad00702c4da83a333a773 Apr 21 02:40:34.046400 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.895294 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 02:40:34.046400 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:33.898035 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-147.ec2.internal\" not found" Apr 21 02:40:34.046400 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:33.898857 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe812a68c721006fc84b840bb8d76277.slice/crio-1a108c1273d0859a3ba3fc6a8877a44bee709bdaf625b868b4074c994ae92512 WatchSource:0}: Error finding container 1a108c1273d0859a3ba3fc6a8877a44bee709bdaf625b868b4074c994ae92512: Status 404 returned error can't find the container with id 1a108c1273d0859a3ba3fc6a8877a44bee709bdaf625b868b4074c994ae92512 Apr 21 02:40:34.046400 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.907981 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-147.ec2.internal" event={"ID":"be812a68c721006fc84b840bb8d76277","Type":"ContainerStarted","Data":"1a108c1273d0859a3ba3fc6a8877a44bee709bdaf625b868b4074c994ae92512"} Apr 21 02:40:34.046400 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:33.908840 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-147.ec2.internal" event={"ID":"b571a1977453700e90c1fa2adc7c4324","Type":"ContainerStarted","Data":"2290611c2cbb68a4eda9b4e64356177207827f07e3bad00702c4da83a333a773"} Apr 21 02:40:34.046400 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:33.998361 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-147.ec2.internal\" not found" Apr 21 02:40:34.300134 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:34.098760 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-147.ec2.internal\" not found" Apr 21 02:40:34.300134 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.135885 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 02:40:34.300134 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.189322 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-147.ec2.internal" Apr 21 02:40:34.300134 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.201098 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 02:40:34.300134 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.202653 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-147.ec2.internal" Apr 21 02:40:34.300134 ip-10-0-137-147 kubenswrapper[2572]: 
I0421 02:40:34.210593 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 02:40:34.300134 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.224480 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 02:40:34.763316 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.763276 2572 apiserver.go:52] "Watching apiserver" Apr 21 02:40:34.771334 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.771305 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 21 02:40:34.771833 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.771804 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-48qgw","openshift-multus/network-metrics-daemon-bzdk8","kube-system/global-pull-secret-syncer-xlxwj","openshift-cluster-node-tuning-operator/tuned-rtcq6","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-147.ec2.internal","openshift-multus/multus-additional-cni-plugins-kqfw8","openshift-network-diagnostics/network-check-target-cbjx7","openshift-network-operator/iptables-alerter-wcs7x","openshift-ovn-kubernetes/ovnkube-node-cpdzb","kube-system/konnectivity-agent-69dnr","kube-system/kube-apiserver-proxy-ip-10-0-137-147.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f","openshift-dns/node-resolver-9mbn4","openshift-image-registry/node-ca-wwdwn"] Apr 21 02:40:34.774437 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.774373 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-69dnr" Apr 21 02:40:34.776489 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.776466 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:40:34.776634 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.776545 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-spxj6\"" Apr 21 02:40:34.776634 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:34.776559 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cbjx7" podUID="5d35cb86-19fa-41be-9ae2-d70d8dbe564d" Apr 21 02:40:34.776726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.776637 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 02:40:34.776726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.776667 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 02:40:34.780579 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.780560 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-wcs7x" Apr 21 02:40:34.782261 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.782238 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 21 02:40:34.782451 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.782435 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 21 02:40:34.782515 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.782504 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 21 02:40:34.782601 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.782586 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-v66n5\"" Apr 21 02:40:34.783224 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.783208 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:40:34.783288 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:34.783267 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xlxwj" podUID="7cc43e1f-6f61-404a-ad72-d62ed23cea64" Apr 21 02:40:34.783400 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.783385 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.785129 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.785104 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 21 02:40:34.785203 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.785141 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-4wd74\"" Apr 21 02:40:34.785345 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.785328 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 21 02:40:34.785617 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.785599 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kqfw8" Apr 21 02:40:34.787661 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.787500 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 02:40:34.787661 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.787564 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-rmwfk\"" Apr 21 02:40:34.787661 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.787595 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 02:40:34.787866 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.787681 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 02:40:34.788069 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.787918 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 02:40:34.788069 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.788017 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.788593 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.788576 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 02:40:34.789853 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.789827 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 02:40:34.789942 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.789881 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-jht6t\"" Apr 21 02:40:34.790387 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.790327 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-9mbn4" Apr 21 02:40:34.790468 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.790396 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 02:40:34.791036 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.790917 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 02:40:34.791803 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.791387 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 02:40:34.791803 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.791660 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 02:40:34.791803 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.791728 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 02:40:34.792787 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.792283 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-zcs4l\"" Apr 21 02:40:34.792787 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.792310 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 02:40:34.792787 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.792694 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 02:40:34.794933 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.794912 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:40:34.795022 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:34.795006 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzdk8" podUID="c2a4d15a-56b4-43a2-b85f-305025a28b5e" Apr 21 02:40:34.797508 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.797489 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-48qgw" Apr 21 02:40:34.799417 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.799397 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 02:40:34.799512 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.799431 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-fhqk2\"" Apr 21 02:40:34.799768 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.799753 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-wwdwn" Apr 21 02:40:34.800904 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.800885 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-etc-sysconfig\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.800999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.800912 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/773d483d-dfc3-4e6e-b1fa-f8da910c09d0-os-release\") pod \"multus-additional-cni-plugins-kqfw8\" (UID: \"773d483d-dfc3-4e6e-b1fa-f8da910c09d0\") " pod="openshift-multus/multus-additional-cni-plugins-kqfw8" Apr 21 02:40:34.800999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.800928 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts7kb\" (UniqueName: \"kubernetes.io/projected/a9bda1dd-f3d4-41e7-9167-d144e08a951c-kube-api-access-ts7kb\") pod \"node-resolver-9mbn4\" (UID: \"a9bda1dd-f3d4-41e7-9167-d144e08a951c\") " pod="openshift-dns/node-resolver-9mbn4" Apr 21 02:40:34.800999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.800944 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a9bda1dd-f3d4-41e7-9167-d144e08a951c-tmp-dir\") pod \"node-resolver-9mbn4\" (UID: \"a9bda1dd-f3d4-41e7-9167-d144e08a951c\") " pod="openshift-dns/node-resolver-9mbn4" Apr 21 02:40:34.800999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.800960 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-var-lib-openvswitch\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.800999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.800984 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-sys\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.801307 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801018 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-host-run-netns\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.801307 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801063 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-log-socket\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.801307 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801097 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-host-cni-bin\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.801307 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801124 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0896d03a-bffd-41a6-83ef-fae8f7e239a7-env-overrides\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.801307 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801150 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/773d483d-dfc3-4e6e-b1fa-f8da910c09d0-cnibin\") pod \"multus-additional-cni-plugins-kqfw8\" (UID: \"773d483d-dfc3-4e6e-b1fa-f8da910c09d0\") " pod="openshift-multus/multus-additional-cni-plugins-kqfw8" Apr 21 02:40:34.801307 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801174 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-run-systemd\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.801307 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801198 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcls5\" (UniqueName: \"kubernetes.io/projected/5d35cb86-19fa-41be-9ae2-d70d8dbe564d-kube-api-access-dcls5\") pod \"network-check-target-cbjx7\" (UID: \"5d35cb86-19fa-41be-9ae2-d70d8dbe564d\") " pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:40:34.801307 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801223 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/be34bfe8-460c-4d98-b893-fc4b5cf1a081-tmp\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.801307 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801272 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/60d6f338-9195-4f2e-ab8b-d1a92cd1fc22-konnectivity-ca\") pod \"konnectivity-agent-69dnr\" (UID: \"60d6f338-9195-4f2e-ab8b-d1a92cd1fc22\") " pod="kube-system/konnectivity-agent-69dnr" Apr 21 02:40:34.801307 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801303 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-host-cni-netd\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.801874 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801341 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-etc-kubernetes\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.801874 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801384 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 02:40:34.801874 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801384 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/773d483d-dfc3-4e6e-b1fa-f8da910c09d0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kqfw8\" (UID: \"773d483d-dfc3-4e6e-b1fa-f8da910c09d0\") " pod="openshift-multus/multus-additional-cni-plugins-kqfw8" Apr 21 02:40:34.801874 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801429 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-host-slash\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.801874 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801453 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.801874 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801476 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-node-log\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.801874 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801500 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0896d03a-bffd-41a6-83ef-fae8f7e239a7-ovnkube-script-lib\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.801874 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801538 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-var-lib-kubelet\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.801874 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801548 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-88fsd\"" Apr 21 02:40:34.801874 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801564 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/0896d03a-bffd-41a6-83ef-fae8f7e239a7-ovn-node-metrics-cert\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.801874 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801609 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-etc-sysctl-conf\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.801874 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801637 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-run\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.801874 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801664 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5fjw\" (UniqueName: \"kubernetes.io/projected/be34bfe8-460c-4d98-b893-fc4b5cf1a081-kube-api-access-q5fjw\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.801874 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801676 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 02:40:34.801874 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801688 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-lib-modules\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.801874 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801702 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 02:40:34.801874 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801712 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/773d483d-dfc3-4e6e-b1fa-f8da910c09d0-system-cni-dir\") pod \"multus-additional-cni-plugins-kqfw8\" (UID: \"773d483d-dfc3-4e6e-b1fa-f8da910c09d0\") " pod="openshift-multus/multus-additional-cni-plugins-kqfw8" Apr 21 02:40:34.801874 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801740 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnk2r\" (UniqueName: \"kubernetes.io/projected/773d483d-dfc3-4e6e-b1fa-f8da910c09d0-kube-api-access-hnk2r\") pod \"multus-additional-cni-plugins-kqfw8\" (UID: \"773d483d-dfc3-4e6e-b1fa-f8da910c09d0\") " pod="openshift-multus/multus-additional-cni-plugins-kqfw8" Apr 21 02:40:34.802734 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801768 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0c24aea3-2431-4f11-b536-892b8dfd1331-iptables-alerter-script\") pod 
\"iptables-alerter-wcs7x\" (UID: \"0c24aea3-2431-4f11-b536-892b8dfd1331\") " pod="openshift-network-operator/iptables-alerter-wcs7x" Apr 21 02:40:34.802734 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801794 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-run-openvswitch\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.802734 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801819 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-etc-modprobe-d\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.802734 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801844 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-etc-systemd\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.802734 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801867 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/773d483d-dfc3-4e6e-b1fa-f8da910c09d0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kqfw8\" (UID: \"773d483d-dfc3-4e6e-b1fa-f8da910c09d0\") " pod="openshift-multus/multus-additional-cni-plugins-kqfw8" Apr 21 02:40:34.802734 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801893 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-systemd-units\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.802734 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801912 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsp9d\" (UniqueName: \"kubernetes.io/projected/0896d03a-bffd-41a6-83ef-fae8f7e239a7-kube-api-access-rsp9d\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.802734 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801931 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0c24aea3-2431-4f11-b536-892b8dfd1331-host-slash\") pod \"iptables-alerter-wcs7x\" (UID: \"0c24aea3-2431-4f11-b536-892b8dfd1331\") " pod="openshift-network-operator/iptables-alerter-wcs7x" Apr 21 02:40:34.802734 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801951 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4zh9\" (UniqueName: \"kubernetes.io/projected/0c24aea3-2431-4f11-b536-892b8dfd1331-kube-api-access-l4zh9\") pod \"iptables-alerter-wcs7x\" (UID: \"0c24aea3-2431-4f11-b536-892b8dfd1331\") " 
pod="openshift-network-operator/iptables-alerter-wcs7x" Apr 21 02:40:34.802734 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801967 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7cc43e1f-6f61-404a-ad72-d62ed23cea64-original-pull-secret\") pod \"global-pull-secret-syncer-xlxwj\" (UID: \"7cc43e1f-6f61-404a-ad72-d62ed23cea64\") " pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:40:34.802734 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801980 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-etc-sysctl-d\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.802734 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.801997 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a9bda1dd-f3d4-41e7-9167-d144e08a951c-hosts-file\") pod \"node-resolver-9mbn4\" (UID: \"a9bda1dd-f3d4-41e7-9167-d144e08a951c\") " pod="openshift-dns/node-resolver-9mbn4" Apr 21 02:40:34.802734 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.802012 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-host-run-ovn-kubernetes\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.802734 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.802036 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0896d03a-bffd-41a6-83ef-fae8f7e239a7-ovnkube-config\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.802734 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.802065 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-host\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.802734 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.802096 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/60d6f338-9195-4f2e-ab8b-d1a92cd1fc22-agent-certs\") pod \"konnectivity-agent-69dnr\" (UID: \"60d6f338-9195-4f2e-ab8b-d1a92cd1fc22\") " pod="kube-system/konnectivity-agent-69dnr" Apr 21 02:40:34.803780 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.802118 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7cc43e1f-6f61-404a-ad72-d62ed23cea64-dbus\") pod \"global-pull-secret-syncer-xlxwj\" (UID: \"7cc43e1f-6f61-404a-ad72-d62ed23cea64\") " pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:40:34.803780 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.802141 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f" Apr 21 02:40:34.803780 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.802141 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-run-ovn\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.803780 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.802218 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/773d483d-dfc3-4e6e-b1fa-f8da910c09d0-cni-binary-copy\") pod \"multus-additional-cni-plugins-kqfw8\" (UID: \"773d483d-dfc3-4e6e-b1fa-f8da910c09d0\") " pod="openshift-multus/multus-additional-cni-plugins-kqfw8" Apr 21 02:40:34.803780 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.802243 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/773d483d-dfc3-4e6e-b1fa-f8da910c09d0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kqfw8\" (UID: \"773d483d-dfc3-4e6e-b1fa-f8da910c09d0\") " pod="openshift-multus/multus-additional-cni-plugins-kqfw8" Apr 21 02:40:34.803780 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.802287 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7cc43e1f-6f61-404a-ad72-d62ed23cea64-kubelet-config\") pod \"global-pull-secret-syncer-xlxwj\" (UID: \"7cc43e1f-6f61-404a-ad72-d62ed23cea64\") " pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:40:34.803780 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.802311 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-host-kubelet\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.803780 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.802334 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-etc-openvswitch\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.803780 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.802366 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/be34bfe8-460c-4d98-b893-fc4b5cf1a081-etc-tuned\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.804201 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.803822 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 02:40:34.804201 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.803874 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 
21 02:40:34.804201 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.803987 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-9jt4z\"" Apr 21 02:40:34.804201 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.804038 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 02:40:34.836610 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.836570 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 02:35:33 +0000 UTC" deadline="2027-12-09 13:58:22.695387754 +0000 UTC" Apr 21 02:40:34.836775 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.836621 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14339h17m47.85878698s" Apr 21 02:40:34.890592 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.890566 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 02:40:34.902592 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.902565 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-run-openvswitch\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.902592 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.902596 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-etc-modprobe-d\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.902803 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.902614 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-etc-systemd\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.902803 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.902640 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/773d483d-dfc3-4e6e-b1fa-f8da910c09d0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kqfw8\" (UID: \"773d483d-dfc3-4e6e-b1fa-f8da910c09d0\") " pod="openshift-multus/multus-additional-cni-plugins-kqfw8" Apr 21 02:40:34.902803 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.902668 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-host-run-netns\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:34.902803 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.902672 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-run-openvswitch\") pod \"ovnkube-node-cpdzb\" (UID: 
\"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.902803 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.902690 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-systemd-units\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.902803 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.902706 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-etc-systemd\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.902803 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.902713 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rsp9d\" (UniqueName: \"kubernetes.io/projected/0896d03a-bffd-41a6-83ef-fae8f7e239a7-kube-api-access-rsp9d\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.902803 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.902739 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-etc-modprobe-d\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.903143 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.902814 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-host-var-lib-cni-bin\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:34.903143 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.902866 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a74d283c-e8f6-4c9d-8587-7098f2a65780-sys-fs\") pod \"aws-ebs-csi-driver-node-q5n6f\" (UID: \"a74d283c-e8f6-4c9d-8587-7098f2a65780\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f" Apr 21 02:40:34.903143 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.902921 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0c24aea3-2431-4f11-b536-892b8dfd1331-host-slash\") pod \"iptables-alerter-wcs7x\" (UID: \"0c24aea3-2431-4f11-b536-892b8dfd1331\") " pod="openshift-network-operator/iptables-alerter-wcs7x" Apr 21 02:40:34.903143 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.902949 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4zh9\" (UniqueName: \"kubernetes.io/projected/0c24aea3-2431-4f11-b536-892b8dfd1331-kube-api-access-l4zh9\") pod \"iptables-alerter-wcs7x\" (UID: \"0c24aea3-2431-4f11-b536-892b8dfd1331\") " pod="openshift-network-operator/iptables-alerter-wcs7x" Apr 21 02:40:34.903143 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.902964 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/0c24aea3-2431-4f11-b536-892b8dfd1331-host-slash\") pod \"iptables-alerter-wcs7x\" (UID: \"0c24aea3-2431-4f11-b536-892b8dfd1331\") " pod="openshift-network-operator/iptables-alerter-wcs7x" Apr 21 02:40:34.903143 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.902964 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-systemd-units\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.903143 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.902979 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7cc43e1f-6f61-404a-ad72-d62ed23cea64-original-pull-secret\") pod \"global-pull-secret-syncer-xlxwj\" (UID: \"7cc43e1f-6f61-404a-ad72-d62ed23cea64\") " pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:40:34.903143 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903014 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-etc-sysctl-d\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.903143 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903039 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a9bda1dd-f3d4-41e7-9167-d144e08a951c-hosts-file\") pod \"node-resolver-9mbn4\" (UID: \"a9bda1dd-f3d4-41e7-9167-d144e08a951c\") " pod="openshift-dns/node-resolver-9mbn4" Apr 21 02:40:34.903143 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903103 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a9bda1dd-f3d4-41e7-9167-d144e08a951c-hosts-file\") pod \"node-resolver-9mbn4\" (UID: \"a9bda1dd-f3d4-41e7-9167-d144e08a951c\") " pod="openshift-dns/node-resolver-9mbn4" Apr 21 02:40:34.903143 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:34.903140 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 02:40:34.903641 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903155 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-etc-sysctl-d\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.903641 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903181 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-etc-kubernetes\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:34.903641 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903219 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-host-run-ovn-kubernetes\") pod \"ovnkube-node-cpdzb\" (UID: 
\"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.903641 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:34.903278 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cc43e1f-6f61-404a-ad72-d62ed23cea64-original-pull-secret podName:7cc43e1f-6f61-404a-ad72-d62ed23cea64 nodeName:}" failed. No retries permitted until 2026-04-21 02:40:35.403209644 +0000 UTC m=+3.087753905 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7cc43e1f-6f61-404a-ad72-d62ed23cea64-original-pull-secret") pod "global-pull-secret-syncer-xlxwj" (UID: "7cc43e1f-6f61-404a-ad72-d62ed23cea64") : object "kube-system"/"original-pull-secret" not registered Apr 21 02:40:34.903641 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903285 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/773d483d-dfc3-4e6e-b1fa-f8da910c09d0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kqfw8\" (UID: \"773d483d-dfc3-4e6e-b1fa-f8da910c09d0\") " pod="openshift-multus/multus-additional-cni-plugins-kqfw8" Apr 21 02:40:34.903641 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903278 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-host-run-ovn-kubernetes\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.903641 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903352 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0896d03a-bffd-41a6-83ef-fae8f7e239a7-ovnkube-config\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.903641 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903382 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-host\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.903641 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903411 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a74d283c-e8f6-4c9d-8587-7098f2a65780-device-dir\") pod \"aws-ebs-csi-driver-node-q5n6f\" (UID: \"a74d283c-e8f6-4c9d-8587-7098f2a65780\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f" Apr 21 02:40:34.903641 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903437 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2qzf\" (UniqueName: \"kubernetes.io/projected/a74d283c-e8f6-4c9d-8587-7098f2a65780-kube-api-access-x2qzf\") pod \"aws-ebs-csi-driver-node-q5n6f\" (UID: \"a74d283c-e8f6-4c9d-8587-7098f2a65780\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f" Apr 21 02:40:34.903641 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903463 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/60d6f338-9195-4f2e-ab8b-d1a92cd1fc22-agent-certs\") pod \"konnectivity-agent-69dnr\" (UID: \"60d6f338-9195-4f2e-ab8b-d1a92cd1fc22\") " pod="kube-system/konnectivity-agent-69dnr" Apr 21 02:40:34.903641 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903477 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-host\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.903641 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903490 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7cc43e1f-6f61-404a-ad72-d62ed23cea64-dbus\") pod \"global-pull-secret-syncer-xlxwj\" (UID: \"7cc43e1f-6f61-404a-ad72-d62ed23cea64\") " pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:40:34.903641 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903544 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-run-ovn\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.903641 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903576 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/773d483d-dfc3-4e6e-b1fa-f8da910c09d0-cni-binary-copy\") pod \"multus-additional-cni-plugins-kqfw8\" (UID: \"773d483d-dfc3-4e6e-b1fa-f8da910c09d0\") " pod="openshift-multus/multus-additional-cni-plugins-kqfw8" Apr 21 02:40:34.903641 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903603 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/773d483d-dfc3-4e6e-b1fa-f8da910c09d0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kqfw8\" (UID: \"773d483d-dfc3-4e6e-b1fa-f8da910c09d0\") " pod="openshift-multus/multus-additional-cni-plugins-kqfw8" Apr 21 02:40:34.904349 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903632 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-host-var-lib-cni-multus\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:34.904349 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903634 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7cc43e1f-6f61-404a-ad72-d62ed23cea64-dbus\") pod \"global-pull-secret-syncer-xlxwj\" (UID: \"7cc43e1f-6f61-404a-ad72-d62ed23cea64\") " pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:40:34.904349 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903660 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-host-var-lib-kubelet\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:34.904349 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903681 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-run-ovn\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.904349 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903684 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-host-run-multus-certs\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:34.904349 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903743 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7cc43e1f-6f61-404a-ad72-d62ed23cea64-kubelet-config\") pod \"global-pull-secret-syncer-xlxwj\" (UID: \"7cc43e1f-6f61-404a-ad72-d62ed23cea64\") " pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:40:34.904349 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903769 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-host-kubelet\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.904349 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903809 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/773d483d-dfc3-4e6e-b1fa-f8da910c09d0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kqfw8\" (UID: \"773d483d-dfc3-4e6e-b1fa-f8da910c09d0\") " pod="openshift-multus/multus-additional-cni-plugins-kqfw8" Apr 21 02:40:34.904349 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903825 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-etc-openvswitch\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.904349 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903827 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7cc43e1f-6f61-404a-ad72-d62ed23cea64-kubelet-config\") pod \"global-pull-secret-syncer-xlxwj\" (UID: \"7cc43e1f-6f61-404a-ad72-d62ed23cea64\") " pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:40:34.904349 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903891 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0896d03a-bffd-41a6-83ef-fae8f7e239a7-ovnkube-config\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.904349 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903896 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/be34bfe8-460c-4d98-b893-fc4b5cf1a081-etc-tuned\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 
21 02:40:34.904349 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903912 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 02:40:34.904349 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903932 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-host-kubelet\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.904349 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903920 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-etc-sysconfig\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.904349 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903971 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/773d483d-dfc3-4e6e-b1fa-f8da910c09d0-os-release\") pod \"multus-additional-cni-plugins-kqfw8\" (UID: \"773d483d-dfc3-4e6e-b1fa-f8da910c09d0\") " pod="openshift-multus/multus-additional-cni-plugins-kqfw8" Apr 21 02:40:34.904349 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903976 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-etc-sysconfig\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.904349 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.903996 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ts7kb\" (UniqueName: \"kubernetes.io/projected/a9bda1dd-f3d4-41e7-9167-d144e08a951c-kube-api-access-ts7kb\") pod \"node-resolver-9mbn4\" (UID: \"a9bda1dd-f3d4-41e7-9167-d144e08a951c\") " pod="openshift-dns/node-resolver-9mbn4" Apr 21 02:40:34.905222 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904010 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-etc-openvswitch\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.905222 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904043 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8zj8\" (UniqueName: \"kubernetes.io/projected/c2a4d15a-56b4-43a2-b85f-305025a28b5e-kube-api-access-n8zj8\") pod \"network-metrics-daemon-bzdk8\" (UID: \"c2a4d15a-56b4-43a2-b85f-305025a28b5e\") " pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:40:34.905222 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904072 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-system-cni-dir\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 
02:40:34.905222 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904098 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/773d483d-dfc3-4e6e-b1fa-f8da910c09d0-os-release\") pod \"multus-additional-cni-plugins-kqfw8\" (UID: \"773d483d-dfc3-4e6e-b1fa-f8da910c09d0\") " pod="openshift-multus/multus-additional-cni-plugins-kqfw8" Apr 21 02:40:34.905222 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904130 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a74d283c-e8f6-4c9d-8587-7098f2a65780-registration-dir\") pod \"aws-ebs-csi-driver-node-q5n6f\" (UID: \"a74d283c-e8f6-4c9d-8587-7098f2a65780\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f" Apr 21 02:40:34.905222 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904173 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a9bda1dd-f3d4-41e7-9167-d144e08a951c-tmp-dir\") pod \"node-resolver-9mbn4\" (UID: \"a9bda1dd-f3d4-41e7-9167-d144e08a951c\") " pod="openshift-dns/node-resolver-9mbn4" Apr 21 02:40:34.905222 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904261 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-var-lib-openvswitch\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.905222 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904275 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/773d483d-dfc3-4e6e-b1fa-f8da910c09d0-cni-binary-copy\") pod \"multus-additional-cni-plugins-kqfw8\" (UID: \"773d483d-dfc3-4e6e-b1fa-f8da910c09d0\") " pod="openshift-multus/multus-additional-cni-plugins-kqfw8" Apr 21 02:40:34.905222 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904334 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-var-lib-openvswitch\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.905222 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904374 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-sys\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.905222 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904403 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-host-run-netns\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.905222 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904431 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-log-socket\") pod 
\"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.905222 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904475 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-host-run-netns\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.905222 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904475 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-host-cni-bin\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.905222 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904516 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-sys\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.905222 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904544 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0896d03a-bffd-41a6-83ef-fae8f7e239a7-env-overrides\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.905222 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904546 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-log-socket\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.906000 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904535 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-host-cni-bin\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.906000 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904562 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a9bda1dd-f3d4-41e7-9167-d144e08a951c-tmp-dir\") pod \"node-resolver-9mbn4\" (UID: \"a9bda1dd-f3d4-41e7-9167-d144e08a951c\") " pod="openshift-dns/node-resolver-9mbn4" Apr 21 02:40:34.906000 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904577 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/773d483d-dfc3-4e6e-b1fa-f8da910c09d0-cnibin\") pod \"multus-additional-cni-plugins-kqfw8\" (UID: \"773d483d-dfc3-4e6e-b1fa-f8da910c09d0\") " pod="openshift-multus/multus-additional-cni-plugins-kqfw8" Apr 21 02:40:34.906000 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904605 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-multus-conf-dir\") pod 
\"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:34.906000 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904612 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/773d483d-dfc3-4e6e-b1fa-f8da910c09d0-cnibin\") pod \"multus-additional-cni-plugins-kqfw8\" (UID: \"773d483d-dfc3-4e6e-b1fa-f8da910c09d0\") " pod="openshift-multus/multus-additional-cni-plugins-kqfw8" Apr 21 02:40:34.906000 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904630 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a74d283c-e8f6-4c9d-8587-7098f2a65780-kubelet-dir\") pod \"aws-ebs-csi-driver-node-q5n6f\" (UID: \"a74d283c-e8f6-4c9d-8587-7098f2a65780\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f" Apr 21 02:40:34.906000 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904654 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a74d283c-e8f6-4c9d-8587-7098f2a65780-socket-dir\") pod \"aws-ebs-csi-driver-node-q5n6f\" (UID: \"a74d283c-e8f6-4c9d-8587-7098f2a65780\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f" Apr 21 02:40:34.906000 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904687 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-run-systemd\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.906000 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904712 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dcls5\" (UniqueName: \"kubernetes.io/projected/5d35cb86-19fa-41be-9ae2-d70d8dbe564d-kube-api-access-dcls5\") pod \"network-check-target-cbjx7\" (UID: \"5d35cb86-19fa-41be-9ae2-d70d8dbe564d\") " pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:40:34.906000 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904723 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-run-systemd\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.906000 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904736 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/be34bfe8-460c-4d98-b893-fc4b5cf1a081-tmp\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.906000 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904762 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d62dffc6-07e2-43c5-929f-e5547bc6cbb9-serviceca\") pod \"node-ca-wwdwn\" (UID: \"d62dffc6-07e2-43c5-929f-e5547bc6cbb9\") " pod="openshift-image-registry/node-ca-wwdwn" Apr 21 02:40:34.906000 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904789 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/60d6f338-9195-4f2e-ab8b-d1a92cd1fc22-konnectivity-ca\") pod \"konnectivity-agent-69dnr\" (UID: \"60d6f338-9195-4f2e-ab8b-d1a92cd1fc22\") " pod="kube-system/konnectivity-agent-69dnr" Apr 21 02:40:34.906000 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904809 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-host-cni-netd\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.906000 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904824 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-etc-kubernetes\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.906000 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904839 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/773d483d-dfc3-4e6e-b1fa-f8da910c09d0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kqfw8\" (UID: \"773d483d-dfc3-4e6e-b1fa-f8da910c09d0\") " pod="openshift-multus/multus-additional-cni-plugins-kqfw8" Apr 21 02:40:34.906000 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904863 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-host-slash\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.906786 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904882 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.906786 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904908 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-os-release\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:34.906786 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904951 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0896d03a-bffd-41a6-83ef-fae8f7e239a7-env-overrides\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.906786 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904957 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-host-cni-netd\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.906786 
ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904980 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-multus-socket-dir-parent\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:34.906786 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905013 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-node-log\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.906786 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905038 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0896d03a-bffd-41a6-83ef-fae8f7e239a7-ovnkube-script-lib\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.906786 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905067 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-var-lib-kubelet\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.906786 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905093 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs\") pod \"network-metrics-daemon-bzdk8\" (UID: \"c2a4d15a-56b4-43a2-b85f-305025a28b5e\") " pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:40:34.906786 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905119 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-host-run-k8s-cni-cncf-io\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:34.906786 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905145 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqq2p\" (UniqueName: \"kubernetes.io/projected/4c53251f-0bae-438f-82b6-956e50adc4eb-kube-api-access-fqq2p\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:34.906786 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905174 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0896d03a-bffd-41a6-83ef-fae8f7e239a7-ovn-node-metrics-cert\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.906786 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905199 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-etc-sysctl-conf\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.906786 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905222 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-run\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.906786 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905246 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q5fjw\" (UniqueName: \"kubernetes.io/projected/be34bfe8-460c-4d98-b893-fc4b5cf1a081-kube-api-access-q5fjw\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.906786 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905250 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-host-slash\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.906786 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.904983 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-etc-kubernetes\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.907342 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905272 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-multus-cni-dir\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:34.907342 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905286 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.907342 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905299 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-cnibin\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:34.907342 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905355 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-var-lib-kubelet\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.907342 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905361 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-run\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.907342 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905421 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/773d483d-dfc3-4e6e-b1fa-f8da910c09d0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kqfw8\" (UID: \"773d483d-dfc3-4e6e-b1fa-f8da910c09d0\") " pod="openshift-multus/multus-additional-cni-plugins-kqfw8" Apr 21 02:40:34.907342 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905487 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-etc-sysctl-conf\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.907342 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905542 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4c53251f-0bae-438f-82b6-956e50adc4eb-cni-binary-copy\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:34.907342 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905561 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0896d03a-bffd-41a6-83ef-fae8f7e239a7-node-log\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.907342 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905572 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-lib-modules\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.907342 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905603 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/773d483d-dfc3-4e6e-b1fa-f8da910c09d0-system-cni-dir\") pod \"multus-additional-cni-plugins-kqfw8\" (UID: \"773d483d-dfc3-4e6e-b1fa-f8da910c09d0\") " pod="openshift-multus/multus-additional-cni-plugins-kqfw8" Apr 21 02:40:34.907342 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905640 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hnk2r\" (UniqueName: \"kubernetes.io/projected/773d483d-dfc3-4e6e-b1fa-f8da910c09d0-kube-api-access-hnk2r\") pod \"multus-additional-cni-plugins-kqfw8\" (UID: \"773d483d-dfc3-4e6e-b1fa-f8da910c09d0\") " pod="openshift-multus/multus-additional-cni-plugins-kqfw8" Apr 21 02:40:34.907342 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905669 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-hostroot\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:34.907342 ip-10-0-137-147 
kubenswrapper[2572]: I0421 02:40:34.905698 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4c53251f-0bae-438f-82b6-956e50adc4eb-multus-daemon-config\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:34.907342 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905736 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d62dffc6-07e2-43c5-929f-e5547bc6cbb9-host\") pod \"node-ca-wwdwn\" (UID: \"d62dffc6-07e2-43c5-929f-e5547bc6cbb9\") " pod="openshift-image-registry/node-ca-wwdwn" Apr 21 02:40:34.907342 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905767 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v9cw\" (UniqueName: \"kubernetes.io/projected/d62dffc6-07e2-43c5-929f-e5547bc6cbb9-kube-api-access-2v9cw\") pod \"node-ca-wwdwn\" (UID: \"d62dffc6-07e2-43c5-929f-e5547bc6cbb9\") " pod="openshift-image-registry/node-ca-wwdwn" Apr 21 02:40:34.907342 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905796 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a74d283c-e8f6-4c9d-8587-7098f2a65780-etc-selinux\") pod \"aws-ebs-csi-driver-node-q5n6f\" (UID: \"a74d283c-e8f6-4c9d-8587-7098f2a65780\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f" Apr 21 02:40:34.907922 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905809 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0896d03a-bffd-41a6-83ef-fae8f7e239a7-ovnkube-script-lib\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.907922 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905825 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0c24aea3-2431-4f11-b536-892b8dfd1331-iptables-alerter-script\") pod \"iptables-alerter-wcs7x\" (UID: \"0c24aea3-2431-4f11-b536-892b8dfd1331\") " pod="openshift-network-operator/iptables-alerter-wcs7x" Apr 21 02:40:34.907922 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905826 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/be34bfe8-460c-4d98-b893-fc4b5cf1a081-lib-modules\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.907922 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905908 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/773d483d-dfc3-4e6e-b1fa-f8da910c09d0-system-cni-dir\") pod \"multus-additional-cni-plugins-kqfw8\" (UID: \"773d483d-dfc3-4e6e-b1fa-f8da910c09d0\") " pod="openshift-multus/multus-additional-cni-plugins-kqfw8" Apr 21 02:40:34.907922 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.905812 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/60d6f338-9195-4f2e-ab8b-d1a92cd1fc22-konnectivity-ca\") pod 
\"konnectivity-agent-69dnr\" (UID: \"60d6f338-9195-4f2e-ab8b-d1a92cd1fc22\") " pod="kube-system/konnectivity-agent-69dnr" Apr 21 02:40:34.907922 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.906435 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0c24aea3-2431-4f11-b536-892b8dfd1331-iptables-alerter-script\") pod \"iptables-alerter-wcs7x\" (UID: \"0c24aea3-2431-4f11-b536-892b8dfd1331\") " pod="openshift-network-operator/iptables-alerter-wcs7x" Apr 21 02:40:34.907922 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.907605 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/be34bfe8-460c-4d98-b893-fc4b5cf1a081-etc-tuned\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.908360 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.908342 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/60d6f338-9195-4f2e-ab8b-d1a92cd1fc22-agent-certs\") pod \"konnectivity-agent-69dnr\" (UID: \"60d6f338-9195-4f2e-ab8b-d1a92cd1fc22\") " pod="kube-system/konnectivity-agent-69dnr" Apr 21 02:40:34.908556 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.908538 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0896d03a-bffd-41a6-83ef-fae8f7e239a7-ovn-node-metrics-cert\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.908864 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.908822 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/be34bfe8-460c-4d98-b893-fc4b5cf1a081-tmp\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.910321 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:34.910300 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 02:40:34.910321 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:34.910325 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 02:40:34.910453 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:34.910337 2572 projected.go:194] Error preparing data for projected volume kube-api-access-dcls5 for pod openshift-network-diagnostics/network-check-target-cbjx7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 02:40:34.910453 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:34.910396 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d35cb86-19fa-41be-9ae2-d70d8dbe564d-kube-api-access-dcls5 podName:5d35cb86-19fa-41be-9ae2-d70d8dbe564d nodeName:}" failed. No retries permitted until 2026-04-21 02:40:35.410378672 +0000 UTC m=+3.094922931 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-dcls5" (UniqueName: "kubernetes.io/projected/5d35cb86-19fa-41be-9ae2-d70d8dbe564d-kube-api-access-dcls5") pod "network-check-target-cbjx7" (UID: "5d35cb86-19fa-41be-9ae2-d70d8dbe564d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 02:40:34.911495 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.911473 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4zh9\" (UniqueName: \"kubernetes.io/projected/0c24aea3-2431-4f11-b536-892b8dfd1331-kube-api-access-l4zh9\") pod \"iptables-alerter-wcs7x\" (UID: \"0c24aea3-2431-4f11-b536-892b8dfd1331\") " pod="openshift-network-operator/iptables-alerter-wcs7x" Apr 21 02:40:34.911821 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.911798 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsp9d\" (UniqueName: \"kubernetes.io/projected/0896d03a-bffd-41a6-83ef-fae8f7e239a7-kube-api-access-rsp9d\") pod \"ovnkube-node-cpdzb\" (UID: \"0896d03a-bffd-41a6-83ef-fae8f7e239a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:34.912053 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.912034 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts7kb\" (UniqueName: \"kubernetes.io/projected/a9bda1dd-f3d4-41e7-9167-d144e08a951c-kube-api-access-ts7kb\") pod \"node-resolver-9mbn4\" (UID: \"a9bda1dd-f3d4-41e7-9167-d144e08a951c\") " pod="openshift-dns/node-resolver-9mbn4" Apr 21 02:40:34.913047 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.913027 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnk2r\" (UniqueName: \"kubernetes.io/projected/773d483d-dfc3-4e6e-b1fa-f8da910c09d0-kube-api-access-hnk2r\") pod \"multus-additional-cni-plugins-kqfw8\" (UID: \"773d483d-dfc3-4e6e-b1fa-f8da910c09d0\") " pod="openshift-multus/multus-additional-cni-plugins-kqfw8" Apr 21 02:40:34.913605 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.913586 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5fjw\" (UniqueName: \"kubernetes.io/projected/be34bfe8-460c-4d98-b893-fc4b5cf1a081-kube-api-access-q5fjw\") pod \"tuned-rtcq6\" (UID: \"be34bfe8-460c-4d98-b893-fc4b5cf1a081\") " pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:34.943317 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:34.943284 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 02:40:35.006165 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006080 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs\") pod \"network-metrics-daemon-bzdk8\" (UID: \"c2a4d15a-56b4-43a2-b85f-305025a28b5e\") " pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:40:35.006328 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:35.006195 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 02:40:35.006328 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006224 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-host-run-k8s-cni-cncf-io\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.006328 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006263 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fqq2p\" (UniqueName: \"kubernetes.io/projected/4c53251f-0bae-438f-82b6-956e50adc4eb-kube-api-access-fqq2p\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.006328 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006290 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-multus-cni-dir\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.006328 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006312 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-cnibin\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.006328 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006310 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-host-run-k8s-cni-cncf-io\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.006638 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:35.006332 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs podName:c2a4d15a-56b4-43a2-b85f-305025a28b5e nodeName:}" failed. No retries permitted until 2026-04-21 02:40:35.506307005 +0000 UTC m=+3.190851249 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs") pod "network-metrics-daemon-bzdk8" (UID: "c2a4d15a-56b4-43a2-b85f-305025a28b5e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 02:40:35.006638 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006360 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4c53251f-0bae-438f-82b6-956e50adc4eb-cni-binary-copy\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.006638 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006366 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-cnibin\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.006638 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006374 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-multus-cni-dir\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.006638 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006403 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-hostroot\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.006638 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006424 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4c53251f-0bae-438f-82b6-956e50adc4eb-multus-daemon-config\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.006638 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006440 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d62dffc6-07e2-43c5-929f-e5547bc6cbb9-host\") pod \"node-ca-wwdwn\" (UID: \"d62dffc6-07e2-43c5-929f-e5547bc6cbb9\") " pod="openshift-image-registry/node-ca-wwdwn" Apr 21 02:40:35.006638 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006460 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2v9cw\" (UniqueName: \"kubernetes.io/projected/d62dffc6-07e2-43c5-929f-e5547bc6cbb9-kube-api-access-2v9cw\") pod \"node-ca-wwdwn\" (UID: \"d62dffc6-07e2-43c5-929f-e5547bc6cbb9\") " pod="openshift-image-registry/node-ca-wwdwn" Apr 21 02:40:35.006638 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006487 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a74d283c-e8f6-4c9d-8587-7098f2a65780-etc-selinux\") pod \"aws-ebs-csi-driver-node-q5n6f\" (UID: \"a74d283c-e8f6-4c9d-8587-7098f2a65780\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f" Apr 21 02:40:35.006638 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006494 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/d62dffc6-07e2-43c5-929f-e5547bc6cbb9-host\") pod \"node-ca-wwdwn\" (UID: \"d62dffc6-07e2-43c5-929f-e5547bc6cbb9\") " pod="openshift-image-registry/node-ca-wwdwn" Apr 21 02:40:35.006638 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006533 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-host-run-netns\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.006638 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006546 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-hostroot\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.006638 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006565 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-host-var-lib-cni-bin\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.006638 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006587 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a74d283c-e8f6-4c9d-8587-7098f2a65780-sys-fs\") pod \"aws-ebs-csi-driver-node-q5n6f\" (UID: \"a74d283c-e8f6-4c9d-8587-7098f2a65780\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f" Apr 21 02:40:35.006638 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006627 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-etc-kubernetes\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.006638 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006643 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a74d283c-e8f6-4c9d-8587-7098f2a65780-etc-selinux\") pod \"aws-ebs-csi-driver-node-q5n6f\" (UID: \"a74d283c-e8f6-4c9d-8587-7098f2a65780\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f" Apr 21 02:40:35.007067 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006656 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a74d283c-e8f6-4c9d-8587-7098f2a65780-device-dir\") pod \"aws-ebs-csi-driver-node-q5n6f\" (UID: \"a74d283c-e8f6-4c9d-8587-7098f2a65780\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f" Apr 21 02:40:35.007067 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006681 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-host-var-lib-cni-bin\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.007067 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006683 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2qzf\" 
(UniqueName: \"kubernetes.io/projected/a74d283c-e8f6-4c9d-8587-7098f2a65780-kube-api-access-x2qzf\") pod \"aws-ebs-csi-driver-node-q5n6f\" (UID: \"a74d283c-e8f6-4c9d-8587-7098f2a65780\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f" Apr 21 02:40:35.007067 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006711 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a74d283c-e8f6-4c9d-8587-7098f2a65780-sys-fs\") pod \"aws-ebs-csi-driver-node-q5n6f\" (UID: \"a74d283c-e8f6-4c9d-8587-7098f2a65780\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f" Apr 21 02:40:35.007067 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006718 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-host-var-lib-cni-multus\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.007067 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006742 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-host-run-netns\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.007067 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006744 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-host-var-lib-kubelet\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.007067 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006770 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-host-var-lib-cni-multus\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.007067 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006774 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-host-run-multus-certs\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.007067 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006793 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-host-var-lib-kubelet\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.007067 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006808 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n8zj8\" (UniqueName: \"kubernetes.io/projected/c2a4d15a-56b4-43a2-b85f-305025a28b5e-kube-api-access-n8zj8\") pod \"network-metrics-daemon-bzdk8\" (UID: \"c2a4d15a-56b4-43a2-b85f-305025a28b5e\") " pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:40:35.007067 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006836 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-host-run-multus-certs\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.007067 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006837 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-system-cni-dir\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.007067 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006860 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a74d283c-e8f6-4c9d-8587-7098f2a65780-registration-dir\") pod \"aws-ebs-csi-driver-node-q5n6f\" (UID: \"a74d283c-e8f6-4c9d-8587-7098f2a65780\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f" Apr 21 02:40:35.007067 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006883 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-multus-conf-dir\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.007067 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006903 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a74d283c-e8f6-4c9d-8587-7098f2a65780-kubelet-dir\") pod \"aws-ebs-csi-driver-node-q5n6f\" (UID: \"a74d283c-e8f6-4c9d-8587-7098f2a65780\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f" Apr 21 02:40:35.007067 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006904 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-etc-kubernetes\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.007794 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006942 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a74d283c-e8f6-4c9d-8587-7098f2a65780-registration-dir\") pod \"aws-ebs-csi-driver-node-q5n6f\" (UID: \"a74d283c-e8f6-4c9d-8587-7098f2a65780\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f" Apr 21 02:40:35.007794 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006942 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a74d283c-e8f6-4c9d-8587-7098f2a65780-device-dir\") pod \"aws-ebs-csi-driver-node-q5n6f\" (UID: \"a74d283c-e8f6-4c9d-8587-7098f2a65780\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f" Apr 21 02:40:35.007794 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006881 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-system-cni-dir\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.007794 ip-10-0-137-147 kubenswrapper[2572]: 
I0421 02:40:35.006982 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a74d283c-e8f6-4c9d-8587-7098f2a65780-socket-dir\") pod \"aws-ebs-csi-driver-node-q5n6f\" (UID: \"a74d283c-e8f6-4c9d-8587-7098f2a65780\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f" Apr 21 02:40:35.007794 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.006995 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-multus-conf-dir\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.007794 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.007024 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d62dffc6-07e2-43c5-929f-e5547bc6cbb9-serviceca\") pod \"node-ca-wwdwn\" (UID: \"d62dffc6-07e2-43c5-929f-e5547bc6cbb9\") " pod="openshift-image-registry/node-ca-wwdwn" Apr 21 02:40:35.007794 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.007036 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a74d283c-e8f6-4c9d-8587-7098f2a65780-kubelet-dir\") pod \"aws-ebs-csi-driver-node-q5n6f\" (UID: \"a74d283c-e8f6-4c9d-8587-7098f2a65780\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f" Apr 21 02:40:35.007794 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.007052 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4c53251f-0bae-438f-82b6-956e50adc4eb-multus-daemon-config\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.007794 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.007055 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4c53251f-0bae-438f-82b6-956e50adc4eb-cni-binary-copy\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.007794 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.007057 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-os-release\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.007794 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.007110 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-multus-socket-dir-parent\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.007794 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.007112 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-os-release\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.007794 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.007134 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a74d283c-e8f6-4c9d-8587-7098f2a65780-socket-dir\") pod \"aws-ebs-csi-driver-node-q5n6f\" (UID: \"a74d283c-e8f6-4c9d-8587-7098f2a65780\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f" Apr 21 02:40:35.007794 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.007172 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4c53251f-0bae-438f-82b6-956e50adc4eb-multus-socket-dir-parent\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.007794 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.007390 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d62dffc6-07e2-43c5-929f-e5547bc6cbb9-serviceca\") pod \"node-ca-wwdwn\" (UID: \"d62dffc6-07e2-43c5-929f-e5547bc6cbb9\") " pod="openshift-image-registry/node-ca-wwdwn" Apr 21 02:40:35.014936 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.014914 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqq2p\" (UniqueName: \"kubernetes.io/projected/4c53251f-0bae-438f-82b6-956e50adc4eb-kube-api-access-fqq2p\") pod \"multus-48qgw\" (UID: \"4c53251f-0bae-438f-82b6-956e50adc4eb\") " pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.015046 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.014917 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v9cw\" (UniqueName: \"kubernetes.io/projected/d62dffc6-07e2-43c5-929f-e5547bc6cbb9-kube-api-access-2v9cw\") pod \"node-ca-wwdwn\" (UID: \"d62dffc6-07e2-43c5-929f-e5547bc6cbb9\") " pod="openshift-image-registry/node-ca-wwdwn" Apr 21 02:40:35.015182 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.015159 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8zj8\" (UniqueName: \"kubernetes.io/projected/c2a4d15a-56b4-43a2-b85f-305025a28b5e-kube-api-access-n8zj8\") pod \"network-metrics-daemon-bzdk8\" (UID: \"c2a4d15a-56b4-43a2-b85f-305025a28b5e\") " pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:40:35.015739 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.015720 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2qzf\" (UniqueName: \"kubernetes.io/projected/a74d283c-e8f6-4c9d-8587-7098f2a65780-kube-api-access-x2qzf\") pod \"aws-ebs-csi-driver-node-q5n6f\" (UID: \"a74d283c-e8f6-4c9d-8587-7098f2a65780\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f" Apr 21 02:40:35.052389 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.052365 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 02:40:35.085037 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.085005 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-69dnr" Apr 21 02:40:35.095919 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.095889 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-wcs7x" Apr 21 02:40:35.103637 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.103613 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" Apr 21 02:40:35.108274 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.108252 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kqfw8" Apr 21 02:40:35.115667 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.115648 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:35.121172 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.121151 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9mbn4" Apr 21 02:40:35.126681 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.126661 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-48qgw" Apr 21 02:40:35.133183 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.133163 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wwdwn" Apr 21 02:40:35.138690 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.138673 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f" Apr 21 02:40:35.410637 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.410552 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7cc43e1f-6f61-404a-ad72-d62ed23cea64-original-pull-secret\") pod \"global-pull-secret-syncer-xlxwj\" (UID: \"7cc43e1f-6f61-404a-ad72-d62ed23cea64\") " pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:40:35.410637 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.410601 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dcls5\" (UniqueName: \"kubernetes.io/projected/5d35cb86-19fa-41be-9ae2-d70d8dbe564d-kube-api-access-dcls5\") pod \"network-check-target-cbjx7\" (UID: \"5d35cb86-19fa-41be-9ae2-d70d8dbe564d\") " pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:40:35.410844 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:35.410730 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 02:40:35.410844 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:35.410770 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 02:40:35.410844 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:35.410788 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 02:40:35.411015 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:35.410989 2572 projected.go:194] Error preparing data for projected volume kube-api-access-dcls5 for pod openshift-network-diagnostics/network-check-target-cbjx7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 02:40:35.411159 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:35.411140 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cc43e1f-6f61-404a-ad72-d62ed23cea64-original-pull-secret 
podName:7cc43e1f-6f61-404a-ad72-d62ed23cea64 nodeName:}" failed. No retries permitted until 2026-04-21 02:40:36.411106204 +0000 UTC m=+4.095650459 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7cc43e1f-6f61-404a-ad72-d62ed23cea64-original-pull-secret") pod "global-pull-secret-syncer-xlxwj" (UID: "7cc43e1f-6f61-404a-ad72-d62ed23cea64") : object "kube-system"/"original-pull-secret" not registered Apr 21 02:40:35.411263 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:35.411236 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d35cb86-19fa-41be-9ae2-d70d8dbe564d-kube-api-access-dcls5 podName:5d35cb86-19fa-41be-9ae2-d70d8dbe564d nodeName:}" failed. No retries permitted until 2026-04-21 02:40:36.411198792 +0000 UTC m=+4.095743046 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-dcls5" (UniqueName: "kubernetes.io/projected/5d35cb86-19fa-41be-9ae2-d70d8dbe564d-kube-api-access-dcls5") pod "network-check-target-cbjx7" (UID: "5d35cb86-19fa-41be-9ae2-d70d8dbe564d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 02:40:35.510960 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.510922 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs\") pod \"network-metrics-daemon-bzdk8\" (UID: \"c2a4d15a-56b4-43a2-b85f-305025a28b5e\") " pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:40:35.511150 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:35.511097 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 02:40:35.511214 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:35.511179 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs podName:c2a4d15a-56b4-43a2-b85f-305025a28b5e nodeName:}" failed. No retries permitted until 2026-04-21 02:40:36.511157915 +0000 UTC m=+4.195702168 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs") pod "network-metrics-daemon-bzdk8" (UID: "c2a4d15a-56b4-43a2-b85f-305025a28b5e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 02:40:35.603659 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:35.603636 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod773d483d_dfc3_4e6e_b1fa_f8da910c09d0.slice/crio-c692e6a5d362439f40ddc912f3ba451e50807c727f870c28e8ec345903a5525b WatchSource:0}: Error finding container c692e6a5d362439f40ddc912f3ba451e50807c727f870c28e8ec345903a5525b: Status 404 returned error can't find the container with id c692e6a5d362439f40ddc912f3ba451e50807c727f870c28e8ec345903a5525b Apr 21 02:40:35.607711 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:35.607683 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c24aea3_2431_4f11_b536_892b8dfd1331.slice/crio-8c6968db7d139f7c320d1ff5236d0bf9d499120935bb5df240903dc820876bfc WatchSource:0}: Error finding container 8c6968db7d139f7c320d1ff5236d0bf9d499120935bb5df240903dc820876bfc: Status 404 returned error can't find the container with id 8c6968db7d139f7c320d1ff5236d0bf9d499120935bb5df240903dc820876bfc Apr 21 02:40:35.608619 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:35.608594 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c53251f_0bae_438f_82b6_956e50adc4eb.slice/crio-3a6d58eeb75e683f9cb4e8f75d6a62417c5835b8dd2acc8b54bc747662aa2c9c WatchSource:0}: Error finding container 3a6d58eeb75e683f9cb4e8f75d6a62417c5835b8dd2acc8b54bc747662aa2c9c: Status 404 returned error can't find the container with id 3a6d58eeb75e683f9cb4e8f75d6a62417c5835b8dd2acc8b54bc747662aa2c9c Apr 21 02:40:35.609375 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:35.609340 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60d6f338_9195_4f2e_ab8b_d1a92cd1fc22.slice/crio-2e4829ced9ddff050f378731cc6c952531f1fb02756bee12b8d31bed256d6691 WatchSource:0}: Error finding container 2e4829ced9ddff050f378731cc6c952531f1fb02756bee12b8d31bed256d6691: Status 404 returned error can't find the container with id 2e4829ced9ddff050f378731cc6c952531f1fb02756bee12b8d31bed256d6691 Apr 21 02:40:35.610038 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:35.610021 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0896d03a_bffd_41a6_83ef_fae8f7e239a7.slice/crio-3d4d56de5d8ce2562d0e153af9d253c216ff347e1f3181bb3cd5c36546c0cb4a WatchSource:0}: Error finding container 3d4d56de5d8ce2562d0e153af9d253c216ff347e1f3181bb3cd5c36546c0cb4a: Status 404 returned error can't find the container with id 3d4d56de5d8ce2562d0e153af9d253c216ff347e1f3181bb3cd5c36546c0cb4a Apr 21 02:40:35.631147 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:35.631119 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9bda1dd_f3d4_41e7_9167_d144e08a951c.slice/crio-f0d776e83fd58e40de0fe6b409f014dacc2ee6aac91ebb6607e11cac2d70ba49 WatchSource:0}: Error finding container f0d776e83fd58e40de0fe6b409f014dacc2ee6aac91ebb6607e11cac2d70ba49: Status 404 returned error can't 
find the container with id f0d776e83fd58e40de0fe6b409f014dacc2ee6aac91ebb6607e11cac2d70ba49 Apr 21 02:40:35.632321 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:35.632287 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda74d283c_e8f6_4c9d_8587_7098f2a65780.slice/crio-82708a1063b5ca0e32427a820c41fec439b67d59eed475468c82a2a63f273a3b WatchSource:0}: Error finding container 82708a1063b5ca0e32427a820c41fec439b67d59eed475468c82a2a63f273a3b: Status 404 returned error can't find the container with id 82708a1063b5ca0e32427a820c41fec439b67d59eed475468c82a2a63f273a3b Apr 21 02:40:35.633571 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:40:35.633556 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe34bfe8_460c_4d98_b893_fc4b5cf1a081.slice/crio-97ee2a072251891a03c88438c73271eb5037e322728d9e6748955de4dce7b59d WatchSource:0}: Error finding container 97ee2a072251891a03c88438c73271eb5037e322728d9e6748955de4dce7b59d: Status 404 returned error can't find the container with id 97ee2a072251891a03c88438c73271eb5037e322728d9e6748955de4dce7b59d Apr 21 02:40:35.837180 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.837009 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 02:35:33 +0000 UTC" deadline="2027-10-21 08:34:54.07782005 +0000 UTC" Apr 21 02:40:35.837180 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.837174 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13157h54m18.240648776s" Apr 21 02:40:35.913480 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.913427 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kqfw8" event={"ID":"773d483d-dfc3-4e6e-b1fa-f8da910c09d0","Type":"ContainerStarted","Data":"c692e6a5d362439f40ddc912f3ba451e50807c727f870c28e8ec345903a5525b"} Apr 21 02:40:35.915734 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.915671 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-147.ec2.internal" event={"ID":"be812a68c721006fc84b840bb8d76277","Type":"ContainerStarted","Data":"18cd19cf56b4f0048b34870e1bfbe1519c025349d620420827e586e5ac988ca3"} Apr 21 02:40:35.917226 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.917088 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" event={"ID":"be34bfe8-460c-4d98-b893-fc4b5cf1a081","Type":"ContainerStarted","Data":"97ee2a072251891a03c88438c73271eb5037e322728d9e6748955de4dce7b59d"} Apr 21 02:40:35.918627 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.918566 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f" event={"ID":"a74d283c-e8f6-4c9d-8587-7098f2a65780","Type":"ContainerStarted","Data":"82708a1063b5ca0e32427a820c41fec439b67d59eed475468c82a2a63f273a3b"} Apr 21 02:40:35.920489 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.920453 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9mbn4" event={"ID":"a9bda1dd-f3d4-41e7-9167-d144e08a951c","Type":"ContainerStarted","Data":"f0d776e83fd58e40de0fe6b409f014dacc2ee6aac91ebb6607e11cac2d70ba49"} Apr 21 02:40:35.921929 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.921889 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" event={"ID":"0896d03a-bffd-41a6-83ef-fae8f7e239a7","Type":"ContainerStarted","Data":"3d4d56de5d8ce2562d0e153af9d253c216ff347e1f3181bb3cd5c36546c0cb4a"} Apr 21 02:40:35.923241 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.923183 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-48qgw" event={"ID":"4c53251f-0bae-438f-82b6-956e50adc4eb","Type":"ContainerStarted","Data":"3a6d58eeb75e683f9cb4e8f75d6a62417c5835b8dd2acc8b54bc747662aa2c9c"} Apr 21 02:40:35.924612 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.924565 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wwdwn" event={"ID":"d62dffc6-07e2-43c5-929f-e5547bc6cbb9","Type":"ContainerStarted","Data":"bbb9a81342f2a5ea28df1f654c13916a1ef810015b50442b345e745fc2cd49c3"} Apr 21 02:40:35.926318 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.926274 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-69dnr" event={"ID":"60d6f338-9195-4f2e-ab8b-d1a92cd1fc22","Type":"ContainerStarted","Data":"2e4829ced9ddff050f378731cc6c952531f1fb02756bee12b8d31bed256d6691"} Apr 21 02:40:35.927487 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.927438 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wcs7x" event={"ID":"0c24aea3-2431-4f11-b536-892b8dfd1331","Type":"ContainerStarted","Data":"8c6968db7d139f7c320d1ff5236d0bf9d499120935bb5df240903dc820876bfc"} Apr 21 02:40:35.928251 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:35.928130 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-147.ec2.internal" podStartSLOduration=1.928113617 podStartE2EDuration="1.928113617s" podCreationTimestamp="2026-04-21 02:40:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 02:40:35.927623324 +0000 UTC m=+3.612167585" watchObservedRunningTime="2026-04-21 02:40:35.928113617 +0000 UTC m=+3.612657877" Apr 21 02:40:36.419262 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:36.418474 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dcls5\" (UniqueName: \"kubernetes.io/projected/5d35cb86-19fa-41be-9ae2-d70d8dbe564d-kube-api-access-dcls5\") pod \"network-check-target-cbjx7\" (UID: \"5d35cb86-19fa-41be-9ae2-d70d8dbe564d\") " pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:40:36.419262 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:36.418570 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7cc43e1f-6f61-404a-ad72-d62ed23cea64-original-pull-secret\") pod \"global-pull-secret-syncer-xlxwj\" (UID: \"7cc43e1f-6f61-404a-ad72-d62ed23cea64\") " pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:40:36.419262 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:36.418685 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 02:40:36.419262 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:36.418745 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cc43e1f-6f61-404a-ad72-d62ed23cea64-original-pull-secret podName:7cc43e1f-6f61-404a-ad72-d62ed23cea64 nodeName:}" 
failed. No retries permitted until 2026-04-21 02:40:38.418728148 +0000 UTC m=+6.103272392 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7cc43e1f-6f61-404a-ad72-d62ed23cea64-original-pull-secret") pod "global-pull-secret-syncer-xlxwj" (UID: "7cc43e1f-6f61-404a-ad72-d62ed23cea64") : object "kube-system"/"original-pull-secret" not registered Apr 21 02:40:36.419262 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:36.419136 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 02:40:36.419262 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:36.419168 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 02:40:36.419262 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:36.419181 2572 projected.go:194] Error preparing data for projected volume kube-api-access-dcls5 for pod openshift-network-diagnostics/network-check-target-cbjx7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 02:40:36.419262 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:36.419226 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d35cb86-19fa-41be-9ae2-d70d8dbe564d-kube-api-access-dcls5 podName:5d35cb86-19fa-41be-9ae2-d70d8dbe564d nodeName:}" failed. No retries permitted until 2026-04-21 02:40:38.419211366 +0000 UTC m=+6.103755619 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-dcls5" (UniqueName: "kubernetes.io/projected/5d35cb86-19fa-41be-9ae2-d70d8dbe564d-kube-api-access-dcls5") pod "network-check-target-cbjx7" (UID: "5d35cb86-19fa-41be-9ae2-d70d8dbe564d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 02:40:36.519903 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:36.519865 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs\") pod \"network-metrics-daemon-bzdk8\" (UID: \"c2a4d15a-56b4-43a2-b85f-305025a28b5e\") " pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:40:36.520224 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:36.520203 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 02:40:36.520325 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:36.520286 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs podName:c2a4d15a-56b4-43a2-b85f-305025a28b5e nodeName:}" failed. No retries permitted until 2026-04-21 02:40:38.52026518 +0000 UTC m=+6.204809438 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs") pod "network-metrics-daemon-bzdk8" (UID: "c2a4d15a-56b4-43a2-b85f-305025a28b5e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 02:40:36.905480 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:36.905391 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:40:36.905839 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:36.905503 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cbjx7" podUID="5d35cb86-19fa-41be-9ae2-d70d8dbe564d" Apr 21 02:40:36.905839 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:36.905616 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:40:36.905839 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:36.905685 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xlxwj" podUID="7cc43e1f-6f61-404a-ad72-d62ed23cea64" Apr 21 02:40:36.905839 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:36.905767 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:40:36.906087 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:36.905847 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bzdk8" podUID="c2a4d15a-56b4-43a2-b85f-305025a28b5e" Apr 21 02:40:37.977937 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:37.977877 2572 generic.go:358] "Generic (PLEG): container finished" podID="b571a1977453700e90c1fa2adc7c4324" containerID="f81514628cf6af7babc5c59a7c758d905244078e127d7675aa4c76c65328b95f" exitCode=0 Apr 21 02:40:37.977937 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:37.977935 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-147.ec2.internal" event={"ID":"b571a1977453700e90c1fa2adc7c4324","Type":"ContainerDied","Data":"f81514628cf6af7babc5c59a7c758d905244078e127d7675aa4c76c65328b95f"} Apr 21 02:40:38.435360 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:38.435311 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7cc43e1f-6f61-404a-ad72-d62ed23cea64-original-pull-secret\") pod \"global-pull-secret-syncer-xlxwj\" (UID: \"7cc43e1f-6f61-404a-ad72-d62ed23cea64\") " pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:40:38.435566 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:38.435387 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dcls5\" (UniqueName: \"kubernetes.io/projected/5d35cb86-19fa-41be-9ae2-d70d8dbe564d-kube-api-access-dcls5\") pod \"network-check-target-cbjx7\" (UID: \"5d35cb86-19fa-41be-9ae2-d70d8dbe564d\") " pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:40:38.435630 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:38.435620 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 02:40:38.436073 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:38.435685 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cc43e1f-6f61-404a-ad72-d62ed23cea64-original-pull-secret podName:7cc43e1f-6f61-404a-ad72-d62ed23cea64 nodeName:}" failed. No retries permitted until 2026-04-21 02:40:42.435666172 +0000 UTC m=+10.120210417 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7cc43e1f-6f61-404a-ad72-d62ed23cea64-original-pull-secret") pod "global-pull-secret-syncer-xlxwj" (UID: "7cc43e1f-6f61-404a-ad72-d62ed23cea64") : object "kube-system"/"original-pull-secret" not registered Apr 21 02:40:38.436398 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:38.436270 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 02:40:38.436398 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:38.436300 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 02:40:38.436398 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:38.436315 2572 projected.go:194] Error preparing data for projected volume kube-api-access-dcls5 for pod openshift-network-diagnostics/network-check-target-cbjx7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 02:40:38.436398 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:38.436371 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d35cb86-19fa-41be-9ae2-d70d8dbe564d-kube-api-access-dcls5 podName:5d35cb86-19fa-41be-9ae2-d70d8dbe564d nodeName:}" failed. No retries permitted until 2026-04-21 02:40:42.436354093 +0000 UTC m=+10.120898350 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-dcls5" (UniqueName: "kubernetes.io/projected/5d35cb86-19fa-41be-9ae2-d70d8dbe564d-kube-api-access-dcls5") pod "network-check-target-cbjx7" (UID: "5d35cb86-19fa-41be-9ae2-d70d8dbe564d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 02:40:38.536326 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:38.536285 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs\") pod \"network-metrics-daemon-bzdk8\" (UID: \"c2a4d15a-56b4-43a2-b85f-305025a28b5e\") " pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:40:38.536501 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:38.536464 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 02:40:38.536647 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:38.536593 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs podName:c2a4d15a-56b4-43a2-b85f-305025a28b5e nodeName:}" failed. No retries permitted until 2026-04-21 02:40:42.536511035 +0000 UTC m=+10.221055275 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs") pod "network-metrics-daemon-bzdk8" (UID: "c2a4d15a-56b4-43a2-b85f-305025a28b5e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 02:40:38.906500 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:38.905804 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:40:38.906500 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:38.905930 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cbjx7" podUID="5d35cb86-19fa-41be-9ae2-d70d8dbe564d" Apr 21 02:40:38.906500 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:38.906377 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:40:38.906500 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:38.906464 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xlxwj" podUID="7cc43e1f-6f61-404a-ad72-d62ed23cea64" Apr 21 02:40:38.906845 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:38.906587 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:40:38.906845 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:38.906672 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzdk8" podUID="c2a4d15a-56b4-43a2-b85f-305025a28b5e" Apr 21 02:40:40.906213 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:40.905548 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:40:40.906213 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:40.905645 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xlxwj" podUID="7cc43e1f-6f61-404a-ad72-d62ed23cea64" Apr 21 02:40:40.906213 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:40.906010 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:40:40.906213 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:40.906098 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzdk8" podUID="c2a4d15a-56b4-43a2-b85f-305025a28b5e" Apr 21 02:40:40.907108 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:40.906774 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:40:40.907108 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:40.906841 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cbjx7" podUID="5d35cb86-19fa-41be-9ae2-d70d8dbe564d" Apr 21 02:40:42.471838 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:42.471712 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dcls5\" (UniqueName: \"kubernetes.io/projected/5d35cb86-19fa-41be-9ae2-d70d8dbe564d-kube-api-access-dcls5\") pod \"network-check-target-cbjx7\" (UID: \"5d35cb86-19fa-41be-9ae2-d70d8dbe564d\") " pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:40:42.471838 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:42.471794 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7cc43e1f-6f61-404a-ad72-d62ed23cea64-original-pull-secret\") pod \"global-pull-secret-syncer-xlxwj\" (UID: \"7cc43e1f-6f61-404a-ad72-d62ed23cea64\") " pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:40:42.472352 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:42.471870 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 02:40:42.472352 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:42.471904 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 02:40:42.472352 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:42.471918 2572 projected.go:194] Error preparing data for projected volume kube-api-access-dcls5 for pod openshift-network-diagnostics/network-check-target-cbjx7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 02:40:42.472352 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:42.471979 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d35cb86-19fa-41be-9ae2-d70d8dbe564d-kube-api-access-dcls5 podName:5d35cb86-19fa-41be-9ae2-d70d8dbe564d nodeName:}" failed. No retries permitted until 2026-04-21 02:40:50.471961592 +0000 UTC m=+18.156505855 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-dcls5" (UniqueName: "kubernetes.io/projected/5d35cb86-19fa-41be-9ae2-d70d8dbe564d-kube-api-access-dcls5") pod "network-check-target-cbjx7" (UID: "5d35cb86-19fa-41be-9ae2-d70d8dbe564d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 02:40:42.472352 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:42.471870 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 02:40:42.472352 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:42.472353 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cc43e1f-6f61-404a-ad72-d62ed23cea64-original-pull-secret podName:7cc43e1f-6f61-404a-ad72-d62ed23cea64 nodeName:}" failed. No retries permitted until 2026-04-21 02:40:50.472337147 +0000 UTC m=+18.156881402 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7cc43e1f-6f61-404a-ad72-d62ed23cea64-original-pull-secret") pod "global-pull-secret-syncer-xlxwj" (UID: "7cc43e1f-6f61-404a-ad72-d62ed23cea64") : object "kube-system"/"original-pull-secret" not registered Apr 21 02:40:42.573256 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:42.572643 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs\") pod \"network-metrics-daemon-bzdk8\" (UID: \"c2a4d15a-56b4-43a2-b85f-305025a28b5e\") " pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:40:42.573256 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:42.572840 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 02:40:42.573256 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:42.572904 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs podName:c2a4d15a-56b4-43a2-b85f-305025a28b5e nodeName:}" failed. No retries permitted until 2026-04-21 02:40:50.572885084 +0000 UTC m=+18.257429344 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs") pod "network-metrics-daemon-bzdk8" (UID: "c2a4d15a-56b4-43a2-b85f-305025a28b5e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 02:40:42.904537 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:42.904491 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:40:42.904693 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:42.904671 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzdk8" podUID="c2a4d15a-56b4-43a2-b85f-305025a28b5e" Apr 21 02:40:42.904751 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:42.904696 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:40:42.904824 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:42.904801 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xlxwj" podUID="7cc43e1f-6f61-404a-ad72-d62ed23cea64" Apr 21 02:40:42.904873 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:42.904853 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:40:42.904933 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:42.904917 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cbjx7" podUID="5d35cb86-19fa-41be-9ae2-d70d8dbe564d" Apr 21 02:40:44.905121 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:44.905043 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:40:44.905121 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:44.905057 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:40:44.905640 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:44.905043 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:40:44.905640 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:44.905178 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzdk8" podUID="c2a4d15a-56b4-43a2-b85f-305025a28b5e" Apr 21 02:40:44.905640 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:44.905250 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xlxwj" podUID="7cc43e1f-6f61-404a-ad72-d62ed23cea64" Apr 21 02:40:44.905640 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:44.905318 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cbjx7" podUID="5d35cb86-19fa-41be-9ae2-d70d8dbe564d" Apr 21 02:40:46.904347 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:46.904305 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:40:46.904757 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:46.904305 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:40:46.904757 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:46.904444 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cbjx7" podUID="5d35cb86-19fa-41be-9ae2-d70d8dbe564d" Apr 21 02:40:46.904757 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:46.904305 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:40:46.904757 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:46.904541 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzdk8" podUID="c2a4d15a-56b4-43a2-b85f-305025a28b5e" Apr 21 02:40:46.904757 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:46.904632 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xlxwj" podUID="7cc43e1f-6f61-404a-ad72-d62ed23cea64" Apr 21 02:40:48.905402 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:48.905336 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:40:48.905402 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:48.905384 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:40:48.905946 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:48.905459 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xlxwj" podUID="7cc43e1f-6f61-404a-ad72-d62ed23cea64" Apr 21 02:40:48.905946 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:48.905531 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzdk8" podUID="c2a4d15a-56b4-43a2-b85f-305025a28b5e" Apr 21 02:40:48.905946 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:48.905575 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:40:48.905946 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:48.905638 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cbjx7" podUID="5d35cb86-19fa-41be-9ae2-d70d8dbe564d" Apr 21 02:40:50.533725 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:50.533684 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7cc43e1f-6f61-404a-ad72-d62ed23cea64-original-pull-secret\") pod \"global-pull-secret-syncer-xlxwj\" (UID: \"7cc43e1f-6f61-404a-ad72-d62ed23cea64\") " pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:40:50.534186 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:50.533747 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dcls5\" (UniqueName: \"kubernetes.io/projected/5d35cb86-19fa-41be-9ae2-d70d8dbe564d-kube-api-access-dcls5\") pod \"network-check-target-cbjx7\" (UID: \"5d35cb86-19fa-41be-9ae2-d70d8dbe564d\") " pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:40:50.534186 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:50.533830 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 02:40:50.534186 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:50.533842 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 02:40:50.534186 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:50.533857 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 02:40:50.534186 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:50.533868 2572 projected.go:194] Error preparing data for projected volume kube-api-access-dcls5 for pod openshift-network-diagnostics/network-check-target-cbjx7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 02:40:50.534186 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:50.533898 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cc43e1f-6f61-404a-ad72-d62ed23cea64-original-pull-secret podName:7cc43e1f-6f61-404a-ad72-d62ed23cea64 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:06.533880381 +0000 UTC m=+34.218424631 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7cc43e1f-6f61-404a-ad72-d62ed23cea64-original-pull-secret") pod "global-pull-secret-syncer-xlxwj" (UID: "7cc43e1f-6f61-404a-ad72-d62ed23cea64") : object "kube-system"/"original-pull-secret" not registered Apr 21 02:40:50.534186 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:50.533917 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d35cb86-19fa-41be-9ae2-d70d8dbe564d-kube-api-access-dcls5 podName:5d35cb86-19fa-41be-9ae2-d70d8dbe564d nodeName:}" failed. No retries permitted until 2026-04-21 02:41:06.533907529 +0000 UTC m=+34.218451776 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-dcls5" (UniqueName: "kubernetes.io/projected/5d35cb86-19fa-41be-9ae2-d70d8dbe564d-kube-api-access-dcls5") pod "network-check-target-cbjx7" (UID: "5d35cb86-19fa-41be-9ae2-d70d8dbe564d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 02:40:50.635037 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:50.634994 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs\") pod \"network-metrics-daemon-bzdk8\" (UID: \"c2a4d15a-56b4-43a2-b85f-305025a28b5e\") " pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:40:50.635197 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:50.635145 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 02:40:50.635244 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:50.635207 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs podName:c2a4d15a-56b4-43a2-b85f-305025a28b5e nodeName:}" failed. No retries permitted until 2026-04-21 02:41:06.63519011 +0000 UTC m=+34.319734360 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs") pod "network-metrics-daemon-bzdk8" (UID: "c2a4d15a-56b4-43a2-b85f-305025a28b5e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 02:40:50.904466 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:50.904371 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:40:50.904466 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:50.904452 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:40:50.904704 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:50.904493 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:40:50.904704 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:50.904604 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cbjx7" podUID="5d35cb86-19fa-41be-9ae2-d70d8dbe564d" Apr 21 02:40:50.904704 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:50.904674 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzdk8" podUID="c2a4d15a-56b4-43a2-b85f-305025a28b5e" Apr 21 02:40:50.904834 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:50.904742 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xlxwj" podUID="7cc43e1f-6f61-404a-ad72-d62ed23cea64" Apr 21 02:40:52.905464 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:52.905434 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:40:52.905827 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:52.905512 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xlxwj" podUID="7cc43e1f-6f61-404a-ad72-d62ed23cea64" Apr 21 02:40:52.905827 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:52.905595 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:40:52.905827 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:52.905684 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cbjx7" podUID="5d35cb86-19fa-41be-9ae2-d70d8dbe564d" Apr 21 02:40:52.905827 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:52.905722 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:40:52.905827 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:52.905776 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bzdk8" podUID="c2a4d15a-56b4-43a2-b85f-305025a28b5e" Apr 21 02:40:54.008915 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:54.008473 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-48qgw" event={"ID":"4c53251f-0bae-438f-82b6-956e50adc4eb","Type":"ContainerStarted","Data":"738f357ad460bc5ccd9c13f35eca61fdae0df987e5980ee929113df0f42f7933"} Apr 21 02:40:54.010059 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:54.010028 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wwdwn" event={"ID":"d62dffc6-07e2-43c5-929f-e5547bc6cbb9","Type":"ContainerStarted","Data":"e4c880208e8305bf7146345a9ac83ab1761f47ca18692532464b8980206c24f7"} Apr 21 02:40:54.011548 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:54.011395 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-69dnr" event={"ID":"60d6f338-9195-4f2e-ab8b-d1a92cd1fc22","Type":"ContainerStarted","Data":"bfb5ad3d5871e8896a9a2a0b5190bfbb4e1cab49c27dd9f4ebe2e3e203167f2c"} Apr 21 02:40:54.012789 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:54.012766 2572 generic.go:358] "Generic (PLEG): container finished" podID="773d483d-dfc3-4e6e-b1fa-f8da910c09d0" containerID="060e04d0dd2c3fa637d2adb676e9547f3bb22dbeeda8502c2ac4e08f0033bcbd" exitCode=0 Apr 21 02:40:54.012896 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:54.012841 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kqfw8" event={"ID":"773d483d-dfc3-4e6e-b1fa-f8da910c09d0","Type":"ContainerDied","Data":"060e04d0dd2c3fa637d2adb676e9547f3bb22dbeeda8502c2ac4e08f0033bcbd"} Apr 21 02:40:54.014601 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:54.014580 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-147.ec2.internal" event={"ID":"b571a1977453700e90c1fa2adc7c4324","Type":"ContainerStarted","Data":"377785c1dc8925e2e78b3671ae75057dbc61d7f0c08f0c6c9e665ee5ebd60757"} Apr 21 02:40:54.016401 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:54.016047 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" event={"ID":"be34bfe8-460c-4d98-b893-fc4b5cf1a081","Type":"ContainerStarted","Data":"b93e0287aef98d1a3b04c68ff218a10f52b7f9f19ac7567937c7399acd9cce94"} Apr 21 02:40:54.017759 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:54.017731 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f" event={"ID":"a74d283c-e8f6-4c9d-8587-7098f2a65780","Type":"ContainerStarted","Data":"613bcb959b33838cd0a1189a8557c18dc7ebeeaf43888a0811bda5938978f8e0"} Apr 21 02:40:54.019080 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:54.019032 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9mbn4" event={"ID":"a9bda1dd-f3d4-41e7-9167-d144e08a951c","Type":"ContainerStarted","Data":"e297f920e71a210404273032845a44efc4a10dcbf390ad2186a7ba6e34c61630"} Apr 21 02:40:54.022644 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:54.022601 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-48qgw" podStartSLOduration=3.7179594590000002 podStartE2EDuration="21.022566101s" podCreationTimestamp="2026-04-21 02:40:33 +0000 UTC" firstStartedPulling="2026-04-21 02:40:35.629905941 +0000 UTC m=+3.314450179" lastFinishedPulling="2026-04-21 
02:40:52.934512563 +0000 UTC m=+20.619056821" observedRunningTime="2026-04-21 02:40:54.022426747 +0000 UTC m=+21.706971009" watchObservedRunningTime="2026-04-21 02:40:54.022566101 +0000 UTC m=+21.707110365" Apr 21 02:40:54.022745 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:54.022674 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" event={"ID":"0896d03a-bffd-41a6-83ef-fae8f7e239a7","Type":"ContainerStarted","Data":"22e860075cc1ff05c86ab4285e6391907a7a8f4a66b017857315234ee6e2490a"} Apr 21 02:40:54.022745 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:54.022700 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" event={"ID":"0896d03a-bffd-41a6-83ef-fae8f7e239a7","Type":"ContainerStarted","Data":"0a7344ea2d5ebc69701cc9e519f4b17a830b257a773abb2ffcdf77f91d1b8b09"} Apr 21 02:40:54.022745 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:54.022715 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" event={"ID":"0896d03a-bffd-41a6-83ef-fae8f7e239a7","Type":"ContainerStarted","Data":"dbbfe91d94600a548c7184a7c134eef05440f760716ebb729f1a10cd07f42ec8"} Apr 21 02:40:54.022745 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:54.022729 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" event={"ID":"0896d03a-bffd-41a6-83ef-fae8f7e239a7","Type":"ContainerStarted","Data":"d47eb14c93f42b2142f5d6fd3fe9bcf3463aabd3eb0a52751f7a9dd051124f49"} Apr 21 02:40:54.022745 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:54.022741 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" event={"ID":"0896d03a-bffd-41a6-83ef-fae8f7e239a7","Type":"ContainerStarted","Data":"e8022fa77607e7bf45619b1a0f9cb0cd2c993d8a540d02835cab7c845bdd6748"} Apr 21 02:40:54.022978 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:54.022753 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" event={"ID":"0896d03a-bffd-41a6-83ef-fae8f7e239a7","Type":"ContainerStarted","Data":"837a7cb2a5b92a6c560d4507f14e742b01bc7d2dfdfdf114983f534989d10f12"} Apr 21 02:40:54.046514 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:54.046466 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9mbn4" podStartSLOduration=3.766436808 podStartE2EDuration="21.046449417s" podCreationTimestamp="2026-04-21 02:40:33 +0000 UTC" firstStartedPulling="2026-04-21 02:40:35.636614506 +0000 UTC m=+3.321158758" lastFinishedPulling="2026-04-21 02:40:52.91662713 +0000 UTC m=+20.601171367" observedRunningTime="2026-04-21 02:40:54.045790665 +0000 UTC m=+21.730334935" watchObservedRunningTime="2026-04-21 02:40:54.046449417 +0000 UTC m=+21.730993678" Apr 21 02:40:54.055554 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:54.055506 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-wwdwn" podStartSLOduration=11.965190487 podStartE2EDuration="21.055494681s" podCreationTimestamp="2026-04-21 02:40:33 +0000 UTC" firstStartedPulling="2026-04-21 02:40:35.604497267 +0000 UTC m=+3.289041524" lastFinishedPulling="2026-04-21 02:40:44.694801477 +0000 UTC m=+12.379345718" observedRunningTime="2026-04-21 02:40:54.055491251 +0000 UTC m=+21.740035513" watchObservedRunningTime="2026-04-21 02:40:54.055494681 +0000 UTC m=+21.740038935" Apr 21 02:40:54.068181 ip-10-0-137-147 
kubenswrapper[2572]: I0421 02:40:54.066130 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-69dnr" podStartSLOduration=4.779662554 podStartE2EDuration="22.066118674s" podCreationTimestamp="2026-04-21 02:40:32 +0000 UTC" firstStartedPulling="2026-04-21 02:40:35.630267476 +0000 UTC m=+3.314811722" lastFinishedPulling="2026-04-21 02:40:52.916723594 +0000 UTC m=+20.601267842" observedRunningTime="2026-04-21 02:40:54.065247177 +0000 UTC m=+21.749791438" watchObservedRunningTime="2026-04-21 02:40:54.066118674 +0000 UTC m=+21.750662937" Apr 21 02:40:54.075075 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:54.075049 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 02:40:54.081234 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:54.081201 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-rtcq6" podStartSLOduration=3.799381107 podStartE2EDuration="21.08119076s" podCreationTimestamp="2026-04-21 02:40:33 +0000 UTC" firstStartedPulling="2026-04-21 02:40:35.636953554 +0000 UTC m=+3.321497798" lastFinishedPulling="2026-04-21 02:40:52.918763198 +0000 UTC m=+20.603307451" observedRunningTime="2026-04-21 02:40:54.080706676 +0000 UTC m=+21.765250935" watchObservedRunningTime="2026-04-21 02:40:54.08119076 +0000 UTC m=+21.765735020" Apr 21 02:40:54.094689 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:54.094650 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-147.ec2.internal" podStartSLOduration=20.094639208 podStartE2EDuration="20.094639208s" podCreationTimestamp="2026-04-21 02:40:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 02:40:54.094620256 +0000 UTC m=+21.779164516" watchObservedRunningTime="2026-04-21 02:40:54.094639208 +0000 UTC m=+21.779183468" Apr 21 02:40:54.878554 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:54.878418 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T02:40:54.075068578Z","UUID":"b6f0b8a7-673f-4744-b3cd-ca9533f30cb9","Handler":null,"Name":"","Endpoint":""} Apr 21 02:40:54.880968 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:54.880944 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 02:40:54.881108 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:54.880979 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 02:40:54.904302 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:54.904275 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:40:54.904464 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:54.904324 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:40:54.904464 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:54.904411 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xlxwj" podUID="7cc43e1f-6f61-404a-ad72-d62ed23cea64" Apr 21 02:40:54.904606 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:54.904465 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzdk8" podUID="c2a4d15a-56b4-43a2-b85f-305025a28b5e" Apr 21 02:40:54.904606 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:54.904487 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:40:54.904606 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:54.904584 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cbjx7" podUID="5d35cb86-19fa-41be-9ae2-d70d8dbe564d" Apr 21 02:40:55.026713 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:55.026675 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f" event={"ID":"a74d283c-e8f6-4c9d-8587-7098f2a65780","Type":"ContainerStarted","Data":"3214b67568ff26ed01f3a1e77f8638fe70f10a0fb57364afea90cc0ab67a0ddb"} Apr 21 02:40:55.026713 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:55.026716 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f" event={"ID":"a74d283c-e8f6-4c9d-8587-7098f2a65780","Type":"ContainerStarted","Data":"01e9cf0f57644065df81d9ecac3274b36a943da5f2442563d94c67e367808155"} Apr 21 02:40:55.028183 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:55.028155 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wcs7x" event={"ID":"0c24aea3-2431-4f11-b536-892b8dfd1331","Type":"ContainerStarted","Data":"abb66bdede64fc869acdc89f4abe60ab294067ee951cc200555edc6dc8c52cdb"} Apr 21 02:40:55.060665 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:55.060605 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5n6f" podStartSLOduration=2.871953742 podStartE2EDuration="22.060587133s" podCreationTimestamp="2026-04-21 02:40:33 +0000 UTC" firstStartedPulling="2026-04-21 02:40:35.636839702 +0000 UTC m=+3.321383943" lastFinishedPulling="2026-04-21 02:40:54.82547309 +0000 UTC m=+22.510017334" observedRunningTime="2026-04-21 02:40:55.048249234 +0000 UTC m=+22.732793494" watchObservedRunningTime="2026-04-21 02:40:55.060587133 +0000 UTC m=+22.745131395" Apr 21 02:40:55.061322 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:55.061286 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-wcs7x" podStartSLOduration=4.774706486 podStartE2EDuration="22.061278498s" podCreationTimestamp="2026-04-21 02:40:33 +0000 UTC" firstStartedPulling="2026-04-21 02:40:35.62990555 +0000 UTC m=+3.314449806" lastFinishedPulling="2026-04-21 02:40:52.916477573 +0000 UTC m=+20.601021818" observedRunningTime="2026-04-21 02:40:55.060308053 +0000 UTC m=+22.744852310" watchObservedRunningTime="2026-04-21 02:40:55.061278498 +0000 UTC m=+22.745822760" Apr 21 02:40:55.972081 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:55.972050 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-69dnr" Apr 21 02:40:56.033566 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:56.033315 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" event={"ID":"0896d03a-bffd-41a6-83ef-fae8f7e239a7","Type":"ContainerStarted","Data":"27b13e834189a83b29de26c1330c5d7c78c38b16e14b8ed9d27fbec9b615ff6e"} Apr 21 02:40:56.908556 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:56.908506 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:40:56.908754 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:56.908508 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:40:56.908754 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:56.908623 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cbjx7" podUID="5d35cb86-19fa-41be-9ae2-d70d8dbe564d" Apr 21 02:40:56.908754 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:56.908516 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:40:56.908754 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:56.908699 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xlxwj" podUID="7cc43e1f-6f61-404a-ad72-d62ed23cea64" Apr 21 02:40:56.908918 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:56.908792 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bzdk8" podUID="c2a4d15a-56b4-43a2-b85f-305025a28b5e" Apr 21 02:40:57.214502 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:57.214428 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-69dnr" Apr 21 02:40:57.215208 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:57.215186 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-69dnr" Apr 21 02:40:58.040934 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:58.040859 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" event={"ID":"0896d03a-bffd-41a6-83ef-fae8f7e239a7","Type":"ContainerStarted","Data":"520427f03261304c9a3b0f6a8c8256206fdad207659f3ad985033d457d4024d5"} Apr 21 02:40:58.041332 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:58.041307 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:58.041915 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:58.041763 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-69dnr" Apr 21 02:40:58.056018 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:58.055999 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:58.063769 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:58.063737 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" podStartSLOduration=7.333406319 podStartE2EDuration="25.063723875s" podCreationTimestamp="2026-04-21 02:40:33 +0000 UTC" firstStartedPulling="2026-04-21 02:40:35.629827068 +0000 UTC m=+3.314371310" lastFinishedPulling="2026-04-21 02:40:53.360144608 +0000 UTC m=+21.044688866" observedRunningTime="2026-04-21 02:40:58.063699557 +0000 UTC m=+25.748243820" watchObservedRunningTime="2026-04-21 02:40:58.063723875 +0000 UTC m=+25.748268134" Apr 21 02:40:58.904612 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:58.904394 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:40:58.905225 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:58.904418 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:40:58.905225 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:58.904714 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzdk8" podUID="c2a4d15a-56b4-43a2-b85f-305025a28b5e" Apr 21 02:40:58.905225 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:58.904456 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:40:58.905225 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:58.904751 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xlxwj" podUID="7cc43e1f-6f61-404a-ad72-d62ed23cea64" Apr 21 02:40:58.905225 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:58.904824 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cbjx7" podUID="5d35cb86-19fa-41be-9ae2-d70d8dbe564d" Apr 21 02:40:59.044485 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:59.044447 2572 generic.go:358] "Generic (PLEG): container finished" podID="773d483d-dfc3-4e6e-b1fa-f8da910c09d0" containerID="0230f439fab67caabc09f6e272c0a6716a1586fe6d978571aeb42777162f4bda" exitCode=0 Apr 21 02:40:59.044639 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:59.044550 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kqfw8" event={"ID":"773d483d-dfc3-4e6e-b1fa-f8da910c09d0","Type":"ContainerDied","Data":"0230f439fab67caabc09f6e272c0a6716a1586fe6d978571aeb42777162f4bda"} Apr 21 02:40:59.044844 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:59.044829 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 02:40:59.045775 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:59.045306 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:59.059377 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:59.059355 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:40:59.943328 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:59.943298 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xlxwj"] Apr 21 02:40:59.943665 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:59.943404 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:40:59.943665 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:59.943492 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xlxwj" podUID="7cc43e1f-6f61-404a-ad72-d62ed23cea64" Apr 21 02:40:59.946643 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:59.946620 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bzdk8"] Apr 21 02:40:59.946753 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:59.946701 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:40:59.946815 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:59.946791 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzdk8" podUID="c2a4d15a-56b4-43a2-b85f-305025a28b5e" Apr 21 02:40:59.947497 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:59.947391 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cbjx7"] Apr 21 02:40:59.947497 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:40:59.947468 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:40:59.947611 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:40:59.947571 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cbjx7" podUID="5d35cb86-19fa-41be-9ae2-d70d8dbe564d" Apr 21 02:41:00.048201 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:00.048163 2572 generic.go:358] "Generic (PLEG): container finished" podID="773d483d-dfc3-4e6e-b1fa-f8da910c09d0" containerID="e983c9ac90fdf7fa77409e7232ffd73ff24d8f80169920af6a1e577c8a09e4dd" exitCode=0 Apr 21 02:41:00.048343 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:00.048256 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kqfw8" event={"ID":"773d483d-dfc3-4e6e-b1fa-f8da910c09d0","Type":"ContainerDied","Data":"e983c9ac90fdf7fa77409e7232ffd73ff24d8f80169920af6a1e577c8a09e4dd"} Apr 21 02:41:00.048446 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:00.048432 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 02:41:01.051888 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:01.051799 2572 generic.go:358] "Generic (PLEG): container finished" podID="773d483d-dfc3-4e6e-b1fa-f8da910c09d0" containerID="524e5fea8a2aa7b25c50c3732af086804463535651a1a7c43eb2337b6e2920f3" exitCode=0 Apr 21 02:41:01.052224 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:01.051888 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kqfw8" event={"ID":"773d483d-dfc3-4e6e-b1fa-f8da910c09d0","Type":"ContainerDied","Data":"524e5fea8a2aa7b25c50c3732af086804463535651a1a7c43eb2337b6e2920f3"} Apr 21 02:41:01.052224 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:01.051991 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 02:41:01.904768 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:01.904729 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:41:01.904919 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:01.904729 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:41:01.904919 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:01.904860 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cbjx7" podUID="5d35cb86-19fa-41be-9ae2-d70d8dbe564d" Apr 21 02:41:01.904919 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:01.904729 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:41:01.904919 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:01.904914 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzdk8" podUID="c2a4d15a-56b4-43a2-b85f-305025a28b5e" Apr 21 02:41:01.905075 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:01.905001 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xlxwj" podUID="7cc43e1f-6f61-404a-ad72-d62ed23cea64" Apr 21 02:41:02.977537 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:02.977289 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:41:02.977971 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:02.977748 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 02:41:02.989458 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:02.989401 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" podUID="0896d03a-bffd-41a6-83ef-fae8f7e239a7" containerName="ovnkube-controller" probeResult="failure" output="" Apr 21 02:41:03.000060 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:03.000025 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" podUID="0896d03a-bffd-41a6-83ef-fae8f7e239a7" containerName="ovnkube-controller" probeResult="failure" output="" Apr 21 02:41:03.904887 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:03.904857 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:41:03.905056 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:03.904903 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:41:03.905056 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:03.904858 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:41:03.905056 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:03.904965 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xlxwj" podUID="7cc43e1f-6f61-404a-ad72-d62ed23cea64" Apr 21 02:41:03.905056 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:03.905046 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cbjx7" podUID="5d35cb86-19fa-41be-9ae2-d70d8dbe564d" Apr 21 02:41:03.905250 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:03.905133 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzdk8" podUID="c2a4d15a-56b4-43a2-b85f-305025a28b5e" Apr 21 02:41:05.904568 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:05.904482 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:41:05.904568 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:05.904489 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:41:05.904568 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:05.904489 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:41:05.905057 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:05.904595 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cbjx7" podUID="5d35cb86-19fa-41be-9ae2-d70d8dbe564d" Apr 21 02:41:05.905057 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:05.904695 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzdk8" podUID="c2a4d15a-56b4-43a2-b85f-305025a28b5e" Apr 21 02:41:05.905057 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:05.904779 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-xlxwj" podUID="7cc43e1f-6f61-404a-ad72-d62ed23cea64" Apr 21 02:41:06.098969 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.098939 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-147.ec2.internal" event="NodeReady" Apr 21 02:41:06.099144 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.099074 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 02:41:06.129581 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.129551 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c777784f9-mqwnx"] Apr 21 02:41:06.168107 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.168026 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk"] Apr 21 02:41:06.168268 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.168164 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c777784f9-mqwnx" Apr 21 02:41:06.170787 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.170750 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 21 02:41:06.170924 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.170796 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-w7tmk\"" Apr 21 02:41:06.171087 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.171063 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 21 02:41:06.171221 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.171202 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 21 02:41:06.171303 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.171223 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 21 02:41:06.184259 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.184236 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-cdc6df59c-qntjh"] Apr 21 02:41:06.184563 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.184514 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" Apr 21 02:41:06.186807 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.186763 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 21 02:41:06.186924 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.186813 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 21 02:41:06.186924 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.186833 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 21 02:41:06.186924 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.186842 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 21 02:41:06.206567 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.206462 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-55f7c6659-qqb6n"] Apr 21 02:41:06.206685 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.206586 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cdc6df59c-qntjh" Apr 21 02:41:06.208596 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.208577 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 21 02:41:06.231224 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.231203 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c777784f9-mqwnx"] Apr 21 02:41:06.231224 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.231226 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-cdc6df59c-qntjh"] Apr 21 02:41:06.231400 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.231235 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk"] Apr 21 02:41:06.231400 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.231245 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zn5hz"] Apr 21 02:41:06.231400 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.231347 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:06.233718 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.233548 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nlcjx\"" Apr 21 02:41:06.233718 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.233577 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 02:41:06.233718 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.233611 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 02:41:06.233718 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.233585 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 02:41:06.239649 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.239628 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 02:41:06.256447 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.256414 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zn5hz"] Apr 21 02:41:06.256604 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.256454 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-55f7c6659-qqb6n"] Apr 21 02:41:06.256604 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.256482 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4c7td"] Apr 21 02:41:06.256714 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.256678 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zn5hz" Apr 21 02:41:06.258602 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.258582 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lm7rd\"" Apr 21 02:41:06.258940 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.258919 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 02:41:06.259128 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.259107 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 02:41:06.263650 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.263475 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnph7\" (UniqueName: \"kubernetes.io/projected/eab79a67-eb0a-4e47-b634-7047d26e3904-kube-api-access-dnph7\") pod \"managed-serviceaccount-addon-agent-7c777784f9-mqwnx\" (UID: \"eab79a67-eb0a-4e47-b634-7047d26e3904\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c777784f9-mqwnx" Apr 21 02:41:06.263650 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.263567 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/eab79a67-eb0a-4e47-b634-7047d26e3904-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7c777784f9-mqwnx\" (UID: \"eab79a67-eb0a-4e47-b634-7047d26e3904\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c777784f9-mqwnx" Apr 21 02:41:06.267399 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.267381 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 02:41:06.278177 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.278157 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4c7td"] Apr 21 02:41:06.278311 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.278285 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-4c7td" Apr 21 02:41:06.280200 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.280178 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 02:41:06.280326 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.280308 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6wzzj\"" Apr 21 02:41:06.280402 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.280385 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 02:41:06.364583 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.364539 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/9aa02d10-4b83-4818-ab56-8c51e74f5f5d-hub\") pod \"cluster-proxy-proxy-agent-5454cdfc-wgbjk\" (UID: \"9aa02d10-4b83-4818-ab56-8c51e74f5f5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" Apr 21 02:41:06.364828 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.364590 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-bound-sa-token\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:06.364828 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.364664 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:06.364828 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.364691 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4fc79abb-a912-47eb-a326-62647f5c9486-registry-certificates\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:06.364828 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.364713 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4fc79abb-a912-47eb-a326-62647f5c9486-installation-pull-secrets\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:06.364828 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.364732 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/9aa02d10-4b83-4818-ab56-8c51e74f5f5d-ca\") pod \"cluster-proxy-proxy-agent-5454cdfc-wgbjk\" (UID: \"9aa02d10-4b83-4818-ab56-8c51e74f5f5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" Apr 21 02:41:06.364828 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.364759 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-lzh8w\" (UniqueName: \"kubernetes.io/projected/9aa02d10-4b83-4818-ab56-8c51e74f5f5d-kube-api-access-lzh8w\") pod \"cluster-proxy-proxy-agent-5454cdfc-wgbjk\" (UID: \"9aa02d10-4b83-4818-ab56-8c51e74f5f5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" Apr 21 02:41:06.364828 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.364780 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9hh8\" (UniqueName: \"kubernetes.io/projected/92b14238-0b93-4982-a17d-81055b501a7b-kube-api-access-d9hh8\") pod \"klusterlet-addon-workmgr-cdc6df59c-qntjh\" (UID: \"92b14238-0b93-4982-a17d-81055b501a7b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cdc6df59c-qntjh" Apr 21 02:41:06.364828 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.364811 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/eab79a67-eb0a-4e47-b634-7047d26e3904-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7c777784f9-mqwnx\" (UID: \"eab79a67-eb0a-4e47-b634-7047d26e3904\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c777784f9-mqwnx" Apr 21 02:41:06.365171 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.364837 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/92b14238-0b93-4982-a17d-81055b501a7b-klusterlet-config\") pod \"klusterlet-addon-workmgr-cdc6df59c-qntjh\" (UID: \"92b14238-0b93-4982-a17d-81055b501a7b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cdc6df59c-qntjh" Apr 21 02:41:06.365171 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.364974 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/9aa02d10-4b83-4818-ab56-8c51e74f5f5d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5454cdfc-wgbjk\" (UID: \"9aa02d10-4b83-4818-ab56-8c51e74f5f5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" Apr 21 02:41:06.365171 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.365006 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fc79abb-a912-47eb-a326-62647f5c9486-trusted-ca\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:06.365171 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.365023 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-cert\") pod \"ingress-canary-zn5hz\" (UID: \"4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a\") " pod="openshift-ingress-canary/ingress-canary-zn5hz" Apr 21 02:41:06.365171 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.365052 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z84cm\" (UniqueName: \"kubernetes.io/projected/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-kube-api-access-z84cm\") pod \"ingress-canary-zn5hz\" (UID: \"4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a\") " pod="openshift-ingress-canary/ingress-canary-zn5hz" Apr 21 02:41:06.365171 
ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.365078 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9aa02d10-4b83-4818-ab56-8c51e74f5f5d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5454cdfc-wgbjk\" (UID: \"9aa02d10-4b83-4818-ab56-8c51e74f5f5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" Apr 21 02:41:06.365171 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.365098 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4fc79abb-a912-47eb-a326-62647f5c9486-image-registry-private-configuration\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:06.365171 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.365116 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxnpq\" (UniqueName: \"kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-kube-api-access-wxnpq\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:06.365171 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.365146 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnph7\" (UniqueName: \"kubernetes.io/projected/eab79a67-eb0a-4e47-b634-7047d26e3904-kube-api-access-dnph7\") pod \"managed-serviceaccount-addon-agent-7c777784f9-mqwnx\" (UID: \"eab79a67-eb0a-4e47-b634-7047d26e3904\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c777784f9-mqwnx" Apr 21 02:41:06.365171 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.365173 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4fc79abb-a912-47eb-a326-62647f5c9486-ca-trust-extracted\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:06.365495 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.365211 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/9aa02d10-4b83-4818-ab56-8c51e74f5f5d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5454cdfc-wgbjk\" (UID: \"9aa02d10-4b83-4818-ab56-8c51e74f5f5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" Apr 21 02:41:06.365495 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.365238 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/92b14238-0b93-4982-a17d-81055b501a7b-tmp\") pod \"klusterlet-addon-workmgr-cdc6df59c-qntjh\" (UID: \"92b14238-0b93-4982-a17d-81055b501a7b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cdc6df59c-qntjh" Apr 21 02:41:06.369636 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.369605 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/eab79a67-eb0a-4e47-b634-7047d26e3904-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7c777784f9-mqwnx\" (UID: \"eab79a67-eb0a-4e47-b634-7047d26e3904\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c777784f9-mqwnx" Apr 21 02:41:06.372423 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.372392 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnph7\" (UniqueName: \"kubernetes.io/projected/eab79a67-eb0a-4e47-b634-7047d26e3904-kube-api-access-dnph7\") pod \"managed-serviceaccount-addon-agent-7c777784f9-mqwnx\" (UID: \"eab79a67-eb0a-4e47-b634-7047d26e3904\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c777784f9-mqwnx" Apr 21 02:41:06.466079 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.465993 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fc79abb-a912-47eb-a326-62647f5c9486-trusted-ca\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:06.466079 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.466036 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-cert\") pod \"ingress-canary-zn5hz\" (UID: \"4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a\") " pod="openshift-ingress-canary/ingress-canary-zn5hz" Apr 21 02:41:06.466079 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.466068 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e8ab2f22-c931-49de-80fd-45193fa7eda9-metrics-tls\") pod \"dns-default-4c7td\" (UID: \"e8ab2f22-c931-49de-80fd-45193fa7eda9\") " pod="openshift-dns/dns-default-4c7td" Apr 21 02:41:06.466358 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.466094 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22cwt\" (UniqueName: \"kubernetes.io/projected/e8ab2f22-c931-49de-80fd-45193fa7eda9-kube-api-access-22cwt\") pod \"dns-default-4c7td\" (UID: \"e8ab2f22-c931-49de-80fd-45193fa7eda9\") " pod="openshift-dns/dns-default-4c7td" Apr 21 02:41:06.466358 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.466150 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z84cm\" (UniqueName: \"kubernetes.io/projected/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-kube-api-access-z84cm\") pod \"ingress-canary-zn5hz\" (UID: \"4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a\") " pod="openshift-ingress-canary/ingress-canary-zn5hz" Apr 21 02:41:06.466358 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:06.466194 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 02:41:06.466358 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:06.466283 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-cert podName:4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a nodeName:}" failed. No retries permitted until 2026-04-21 02:41:06.966259689 +0000 UTC m=+34.650803939 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-cert") pod "ingress-canary-zn5hz" (UID: "4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a") : secret "canary-serving-cert" not found Apr 21 02:41:06.466511 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.466399 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9aa02d10-4b83-4818-ab56-8c51e74f5f5d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5454cdfc-wgbjk\" (UID: \"9aa02d10-4b83-4818-ab56-8c51e74f5f5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" Apr 21 02:41:06.466511 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.466438 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4fc79abb-a912-47eb-a326-62647f5c9486-image-registry-private-configuration\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:06.466511 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.466466 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxnpq\" (UniqueName: \"kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-kube-api-access-wxnpq\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:06.466511 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.466500 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4fc79abb-a912-47eb-a326-62647f5c9486-ca-trust-extracted\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:06.466671 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.466547 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/9aa02d10-4b83-4818-ab56-8c51e74f5f5d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5454cdfc-wgbjk\" (UID: \"9aa02d10-4b83-4818-ab56-8c51e74f5f5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" Apr 21 02:41:06.466671 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.466572 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/92b14238-0b93-4982-a17d-81055b501a7b-tmp\") pod \"klusterlet-addon-workmgr-cdc6df59c-qntjh\" (UID: \"92b14238-0b93-4982-a17d-81055b501a7b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cdc6df59c-qntjh" Apr 21 02:41:06.466671 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.466598 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/9aa02d10-4b83-4818-ab56-8c51e74f5f5d-hub\") pod \"cluster-proxy-proxy-agent-5454cdfc-wgbjk\" (UID: \"9aa02d10-4b83-4818-ab56-8c51e74f5f5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" Apr 21 02:41:06.466671 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.466625 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-bound-sa-token\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:06.466671 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.466653 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e8ab2f22-c931-49de-80fd-45193fa7eda9-tmp-dir\") pod \"dns-default-4c7td\" (UID: \"e8ab2f22-c931-49de-80fd-45193fa7eda9\") " pod="openshift-dns/dns-default-4c7td" Apr 21 02:41:06.466860 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.466701 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:06.466860 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.466730 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4fc79abb-a912-47eb-a326-62647f5c9486-registry-certificates\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:06.466860 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.466756 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4fc79abb-a912-47eb-a326-62647f5c9486-installation-pull-secrets\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:06.466860 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.466786 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/9aa02d10-4b83-4818-ab56-8c51e74f5f5d-ca\") pod \"cluster-proxy-proxy-agent-5454cdfc-wgbjk\" (UID: \"9aa02d10-4b83-4818-ab56-8c51e74f5f5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" Apr 21 02:41:06.466860 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.466810 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lzh8w\" (UniqueName: \"kubernetes.io/projected/9aa02d10-4b83-4818-ab56-8c51e74f5f5d-kube-api-access-lzh8w\") pod \"cluster-proxy-proxy-agent-5454cdfc-wgbjk\" (UID: \"9aa02d10-4b83-4818-ab56-8c51e74f5f5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" Apr 21 02:41:06.466860 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.466840 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d9hh8\" (UniqueName: \"kubernetes.io/projected/92b14238-0b93-4982-a17d-81055b501a7b-kube-api-access-d9hh8\") pod \"klusterlet-addon-workmgr-cdc6df59c-qntjh\" (UID: \"92b14238-0b93-4982-a17d-81055b501a7b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cdc6df59c-qntjh" Apr 21 02:41:06.467034 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.466862 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/e8ab2f22-c931-49de-80fd-45193fa7eda9-config-volume\") pod \"dns-default-4c7td\" (UID: \"e8ab2f22-c931-49de-80fd-45193fa7eda9\") " pod="openshift-dns/dns-default-4c7td" Apr 21 02:41:06.467034 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.466897 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/92b14238-0b93-4982-a17d-81055b501a7b-klusterlet-config\") pod \"klusterlet-addon-workmgr-cdc6df59c-qntjh\" (UID: \"92b14238-0b93-4982-a17d-81055b501a7b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cdc6df59c-qntjh" Apr 21 02:41:06.467034 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.466934 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/9aa02d10-4b83-4818-ab56-8c51e74f5f5d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5454cdfc-wgbjk\" (UID: \"9aa02d10-4b83-4818-ab56-8c51e74f5f5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" Apr 21 02:41:06.467120 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.467036 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/92b14238-0b93-4982-a17d-81055b501a7b-tmp\") pod \"klusterlet-addon-workmgr-cdc6df59c-qntjh\" (UID: \"92b14238-0b93-4982-a17d-81055b501a7b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cdc6df59c-qntjh" Apr 21 02:41:06.467120 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.467093 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fc79abb-a912-47eb-a326-62647f5c9486-trusted-ca\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:06.467438 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.467394 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/9aa02d10-4b83-4818-ab56-8c51e74f5f5d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5454cdfc-wgbjk\" (UID: \"9aa02d10-4b83-4818-ab56-8c51e74f5f5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" Apr 21 02:41:06.467604 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.467584 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4fc79abb-a912-47eb-a326-62647f5c9486-ca-trust-extracted\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:06.467955 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:06.467826 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 02:41:06.467955 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:06.467848 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55f7c6659-qqb6n: secret "image-registry-tls" not found Apr 21 02:41:06.467955 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:06.467910 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls 
podName:4fc79abb-a912-47eb-a326-62647f5c9486 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:06.967890969 +0000 UTC m=+34.652435211 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls") pod "image-registry-55f7c6659-qqb6n" (UID: "4fc79abb-a912-47eb-a326-62647f5c9486") : secret "image-registry-tls" not found Apr 21 02:41:06.468626 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.468434 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4fc79abb-a912-47eb-a326-62647f5c9486-registry-certificates\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:06.469394 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.469348 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4fc79abb-a912-47eb-a326-62647f5c9486-image-registry-private-configuration\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:06.469580 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.469543 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9aa02d10-4b83-4818-ab56-8c51e74f5f5d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5454cdfc-wgbjk\" (UID: \"9aa02d10-4b83-4818-ab56-8c51e74f5f5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" Apr 21 02:41:06.469967 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.469940 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/9aa02d10-4b83-4818-ab56-8c51e74f5f5d-ca\") pod \"cluster-proxy-proxy-agent-5454cdfc-wgbjk\" (UID: \"9aa02d10-4b83-4818-ab56-8c51e74f5f5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" Apr 21 02:41:06.470331 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.470290 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4fc79abb-a912-47eb-a326-62647f5c9486-installation-pull-secrets\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:06.470331 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.470314 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/9aa02d10-4b83-4818-ab56-8c51e74f5f5d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5454cdfc-wgbjk\" (UID: \"9aa02d10-4b83-4818-ab56-8c51e74f5f5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" Apr 21 02:41:06.470774 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.470756 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/9aa02d10-4b83-4818-ab56-8c51e74f5f5d-hub\") pod \"cluster-proxy-proxy-agent-5454cdfc-wgbjk\" (UID: \"9aa02d10-4b83-4818-ab56-8c51e74f5f5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" Apr 21 
02:41:06.471541 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.471500 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/92b14238-0b93-4982-a17d-81055b501a7b-klusterlet-config\") pod \"klusterlet-addon-workmgr-cdc6df59c-qntjh\" (UID: \"92b14238-0b93-4982-a17d-81055b501a7b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cdc6df59c-qntjh" Apr 21 02:41:06.476773 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.476713 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z84cm\" (UniqueName: \"kubernetes.io/projected/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-kube-api-access-z84cm\") pod \"ingress-canary-zn5hz\" (UID: \"4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a\") " pod="openshift-ingress-canary/ingress-canary-zn5hz" Apr 21 02:41:06.477130 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.477106 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9hh8\" (UniqueName: \"kubernetes.io/projected/92b14238-0b93-4982-a17d-81055b501a7b-kube-api-access-d9hh8\") pod \"klusterlet-addon-workmgr-cdc6df59c-qntjh\" (UID: \"92b14238-0b93-4982-a17d-81055b501a7b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cdc6df59c-qntjh" Apr 21 02:41:06.477268 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.477230 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-bound-sa-token\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:06.478258 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.478241 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxnpq\" (UniqueName: \"kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-kube-api-access-wxnpq\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:06.479583 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.479562 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzh8w\" (UniqueName: \"kubernetes.io/projected/9aa02d10-4b83-4818-ab56-8c51e74f5f5d-kube-api-access-lzh8w\") pod \"cluster-proxy-proxy-agent-5454cdfc-wgbjk\" (UID: \"9aa02d10-4b83-4818-ab56-8c51e74f5f5d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" Apr 21 02:41:06.488269 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.488249 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c777784f9-mqwnx" Apr 21 02:41:06.497025 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.497008 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" Apr 21 02:41:06.526665 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.526636 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cdc6df59c-qntjh" Apr 21 02:41:06.567230 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.567198 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8ab2f22-c931-49de-80fd-45193fa7eda9-config-volume\") pod \"dns-default-4c7td\" (UID: \"e8ab2f22-c931-49de-80fd-45193fa7eda9\") " pod="openshift-dns/dns-default-4c7td" Apr 21 02:41:06.567399 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.567292 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e8ab2f22-c931-49de-80fd-45193fa7eda9-metrics-tls\") pod \"dns-default-4c7td\" (UID: \"e8ab2f22-c931-49de-80fd-45193fa7eda9\") " pod="openshift-dns/dns-default-4c7td" Apr 21 02:41:06.567399 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.567317 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22cwt\" (UniqueName: \"kubernetes.io/projected/e8ab2f22-c931-49de-80fd-45193fa7eda9-kube-api-access-22cwt\") pod \"dns-default-4c7td\" (UID: \"e8ab2f22-c931-49de-80fd-45193fa7eda9\") " pod="openshift-dns/dns-default-4c7td" Apr 21 02:41:06.567399 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.567343 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7cc43e1f-6f61-404a-ad72-d62ed23cea64-original-pull-secret\") pod \"global-pull-secret-syncer-xlxwj\" (UID: \"7cc43e1f-6f61-404a-ad72-d62ed23cea64\") " pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:41:06.568574 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:06.567720 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 02:41:06.568574 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.567729 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e8ab2f22-c931-49de-80fd-45193fa7eda9-tmp-dir\") pod \"dns-default-4c7td\" (UID: \"e8ab2f22-c931-49de-80fd-45193fa7eda9\") " pod="openshift-dns/dns-default-4c7td" Apr 21 02:41:06.568574 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:06.567871 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 02:41:06.568574 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.567916 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dcls5\" (UniqueName: \"kubernetes.io/projected/5d35cb86-19fa-41be-9ae2-d70d8dbe564d-kube-api-access-dcls5\") pod \"network-check-target-cbjx7\" (UID: \"5d35cb86-19fa-41be-9ae2-d70d8dbe564d\") " pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:41:06.568574 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:06.567962 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cc43e1f-6f61-404a-ad72-d62ed23cea64-original-pull-secret podName:7cc43e1f-6f61-404a-ad72-d62ed23cea64 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:38.567933417 +0000 UTC m=+66.252477670 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7cc43e1f-6f61-404a-ad72-d62ed23cea64-original-pull-secret") pod "global-pull-secret-syncer-xlxwj" (UID: "7cc43e1f-6f61-404a-ad72-d62ed23cea64") : object "kube-system"/"original-pull-secret" not registered Apr 21 02:41:06.568574 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.568047 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8ab2f22-c931-49de-80fd-45193fa7eda9-config-volume\") pod \"dns-default-4c7td\" (UID: \"e8ab2f22-c931-49de-80fd-45193fa7eda9\") " pod="openshift-dns/dns-default-4c7td" Apr 21 02:41:06.568574 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:06.568053 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 02:41:06.568574 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:06.568117 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 02:41:06.568574 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:06.568134 2572 projected.go:194] Error preparing data for projected volume kube-api-access-dcls5 for pod openshift-network-diagnostics/network-check-target-cbjx7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 02:41:06.568574 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.568132 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e8ab2f22-c931-49de-80fd-45193fa7eda9-tmp-dir\") pod \"dns-default-4c7td\" (UID: \"e8ab2f22-c931-49de-80fd-45193fa7eda9\") " pod="openshift-dns/dns-default-4c7td" Apr 21 02:41:06.568574 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:06.568325 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d35cb86-19fa-41be-9ae2-d70d8dbe564d-kube-api-access-dcls5 podName:5d35cb86-19fa-41be-9ae2-d70d8dbe564d nodeName:}" failed. No retries permitted until 2026-04-21 02:41:38.568309041 +0000 UTC m=+66.252853293 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-dcls5" (UniqueName: "kubernetes.io/projected/5d35cb86-19fa-41be-9ae2-d70d8dbe564d-kube-api-access-dcls5") pod "network-check-target-cbjx7" (UID: "5d35cb86-19fa-41be-9ae2-d70d8dbe564d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 02:41:06.568574 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:06.568432 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8ab2f22-c931-49de-80fd-45193fa7eda9-metrics-tls podName:e8ab2f22-c931-49de-80fd-45193fa7eda9 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:07.06840591 +0000 UTC m=+34.752950198 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e8ab2f22-c931-49de-80fd-45193fa7eda9-metrics-tls") pod "dns-default-4c7td" (UID: "e8ab2f22-c931-49de-80fd-45193fa7eda9") : secret "dns-default-metrics-tls" not found Apr 21 02:41:06.574430 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.574406 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22cwt\" (UniqueName: \"kubernetes.io/projected/e8ab2f22-c931-49de-80fd-45193fa7eda9-kube-api-access-22cwt\") pod \"dns-default-4c7td\" (UID: \"e8ab2f22-c931-49de-80fd-45193fa7eda9\") " pod="openshift-dns/dns-default-4c7td" Apr 21 02:41:06.668983 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.668899 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs\") pod \"network-metrics-daemon-bzdk8\" (UID: \"c2a4d15a-56b4-43a2-b85f-305025a28b5e\") " pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:41:06.669234 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:06.669039 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 02:41:06.669234 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:06.669106 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs podName:c2a4d15a-56b4-43a2-b85f-305025a28b5e nodeName:}" failed. No retries permitted until 2026-04-21 02:41:38.669091071 +0000 UTC m=+66.353635309 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs") pod "network-metrics-daemon-bzdk8" (UID: "c2a4d15a-56b4-43a2-b85f-305025a28b5e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 02:41:06.803569 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.803515 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c777784f9-mqwnx"] Apr 21 02:41:06.809260 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.809237 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk"] Apr 21 02:41:06.813230 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.813206 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-cdc6df59c-qntjh"] Apr 21 02:41:06.946040 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:41:06.945990 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeab79a67_eb0a_4e47_b634_7047d26e3904.slice/crio-e6e675a16f3a9304fa6f19e2f1bd1a9c948064df9fde57479a9c29965e5db938 WatchSource:0}: Error finding container e6e675a16f3a9304fa6f19e2f1bd1a9c948064df9fde57479a9c29965e5db938: Status 404 returned error can't find the container with id e6e675a16f3a9304fa6f19e2f1bd1a9c948064df9fde57479a9c29965e5db938 Apr 21 02:41:06.946873 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:41:06.946464 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9aa02d10_4b83_4818_ab56_8c51e74f5f5d.slice/crio-1111603938dbdb2bb91785bc371d32046e92712252d807f96b55738879e2811b 
WatchSource:0}: Error finding container 1111603938dbdb2bb91785bc371d32046e92712252d807f96b55738879e2811b: Status 404 returned error can't find the container with id 1111603938dbdb2bb91785bc371d32046e92712252d807f96b55738879e2811b Apr 21 02:41:06.947132 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:41:06.947099 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92b14238_0b93_4982_a17d_81055b501a7b.slice/crio-a3e1582b674eb2f645c4b15497943af6cd89fda69bf2aae2ffe576b9f1acd9d6 WatchSource:0}: Error finding container a3e1582b674eb2f645c4b15497943af6cd89fda69bf2aae2ffe576b9f1acd9d6: Status 404 returned error can't find the container with id a3e1582b674eb2f645c4b15497943af6cd89fda69bf2aae2ffe576b9f1acd9d6 Apr 21 02:41:06.971914 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.971888 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-cert\") pod \"ingress-canary-zn5hz\" (UID: \"4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a\") " pod="openshift-ingress-canary/ingress-canary-zn5hz" Apr 21 02:41:06.972011 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:06.971996 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:06.972051 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:06.972018 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 02:41:06.972087 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:06.972065 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-cert podName:4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a nodeName:}" failed. No retries permitted until 2026-04-21 02:41:07.972050673 +0000 UTC m=+35.656594911 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-cert") pod "ingress-canary-zn5hz" (UID: "4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a") : secret "canary-serving-cert" not found Apr 21 02:41:06.972133 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:06.972121 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 02:41:06.972167 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:06.972137 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55f7c6659-qqb6n: secret "image-registry-tls" not found Apr 21 02:41:06.972199 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:06.972194 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls podName:4fc79abb-a912-47eb-a326-62647f5c9486 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:07.972178233 +0000 UTC m=+35.656722480 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls") pod "image-registry-55f7c6659-qqb6n" (UID: "4fc79abb-a912-47eb-a326-62647f5c9486") : secret "image-registry-tls" not found Apr 21 02:41:07.064683 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:07.064649 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c777784f9-mqwnx" event={"ID":"eab79a67-eb0a-4e47-b634-7047d26e3904","Type":"ContainerStarted","Data":"e6e675a16f3a9304fa6f19e2f1bd1a9c948064df9fde57479a9c29965e5db938"} Apr 21 02:41:07.065649 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:07.065616 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cdc6df59c-qntjh" event={"ID":"92b14238-0b93-4982-a17d-81055b501a7b","Type":"ContainerStarted","Data":"a3e1582b674eb2f645c4b15497943af6cd89fda69bf2aae2ffe576b9f1acd9d6"} Apr 21 02:41:07.066705 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:07.066684 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" event={"ID":"9aa02d10-4b83-4818-ab56-8c51e74f5f5d","Type":"ContainerStarted","Data":"1111603938dbdb2bb91785bc371d32046e92712252d807f96b55738879e2811b"} Apr 21 02:41:07.073273 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:07.073109 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e8ab2f22-c931-49de-80fd-45193fa7eda9-metrics-tls\") pod \"dns-default-4c7td\" (UID: \"e8ab2f22-c931-49de-80fd-45193fa7eda9\") " pod="openshift-dns/dns-default-4c7td" Apr 21 02:41:07.073370 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:07.073299 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 02:41:07.073370 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:07.073361 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8ab2f22-c931-49de-80fd-45193fa7eda9-metrics-tls podName:e8ab2f22-c931-49de-80fd-45193fa7eda9 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:08.073339751 +0000 UTC m=+35.757884005 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e8ab2f22-c931-49de-80fd-45193fa7eda9-metrics-tls") pod "dns-default-4c7td" (UID: "e8ab2f22-c931-49de-80fd-45193fa7eda9") : secret "dns-default-metrics-tls" not found Apr 21 02:41:07.905690 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:07.905278 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:41:07.905950 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:07.905403 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:41:07.906448 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:07.905460 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:41:07.908435 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:07.908414 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nsvjd\"" Apr 21 02:41:07.909240 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:07.909219 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 02:41:07.909330 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:07.909284 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 02:41:07.909392 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:07.909230 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 02:41:07.909504 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:07.909487 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 02:41:07.909613 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:07.909595 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-j2fpz\"" Apr 21 02:41:07.982574 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:07.982541 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:07.983476 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:07.982637 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 02:41:07.983476 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:07.982649 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-cert\") pod \"ingress-canary-zn5hz\" (UID: \"4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a\") " pod="openshift-ingress-canary/ingress-canary-zn5hz" Apr 21 02:41:07.983476 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:07.982659 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55f7c6659-qqb6n: secret "image-registry-tls" not found Apr 21 02:41:07.983476 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:07.982724 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 02:41:07.983476 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:07.982735 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls podName:4fc79abb-a912-47eb-a326-62647f5c9486 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:09.982716032 +0000 UTC m=+37.667260284 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls") pod "image-registry-55f7c6659-qqb6n" (UID: "4fc79abb-a912-47eb-a326-62647f5c9486") : secret "image-registry-tls" not found Apr 21 02:41:07.983476 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:07.982779 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-cert podName:4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a nodeName:}" failed. No retries permitted until 2026-04-21 02:41:09.982759695 +0000 UTC m=+37.667303948 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-cert") pod "ingress-canary-zn5hz" (UID: "4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a") : secret "canary-serving-cert" not found Apr 21 02:41:08.076842 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:08.076806 2572 generic.go:358] "Generic (PLEG): container finished" podID="773d483d-dfc3-4e6e-b1fa-f8da910c09d0" containerID="4c8a8172434ecc25641af15d1cc63e125da0f442ebf042c6a979cac587fa0ffd" exitCode=0 Apr 21 02:41:08.076997 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:08.076858 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kqfw8" event={"ID":"773d483d-dfc3-4e6e-b1fa-f8da910c09d0","Type":"ContainerDied","Data":"4c8a8172434ecc25641af15d1cc63e125da0f442ebf042c6a979cac587fa0ffd"} Apr 21 02:41:08.084062 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:08.084008 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e8ab2f22-c931-49de-80fd-45193fa7eda9-metrics-tls\") pod \"dns-default-4c7td\" (UID: \"e8ab2f22-c931-49de-80fd-45193fa7eda9\") " pod="openshift-dns/dns-default-4c7td" Apr 21 02:41:08.084253 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:08.084157 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 02:41:08.084253 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:08.084210 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8ab2f22-c931-49de-80fd-45193fa7eda9-metrics-tls podName:e8ab2f22-c931-49de-80fd-45193fa7eda9 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:10.084192578 +0000 UTC m=+37.768736817 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e8ab2f22-c931-49de-80fd-45193fa7eda9-metrics-tls") pod "dns-default-4c7td" (UID: "e8ab2f22-c931-49de-80fd-45193fa7eda9") : secret "dns-default-metrics-tls" not found Apr 21 02:41:09.084775 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:09.084739 2572 generic.go:358] "Generic (PLEG): container finished" podID="773d483d-dfc3-4e6e-b1fa-f8da910c09d0" containerID="38788dca94730078682d58c5f4655e40339e966f59618c23f074c9a9e915ad81" exitCode=0 Apr 21 02:41:09.085212 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:09.084800 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kqfw8" event={"ID":"773d483d-dfc3-4e6e-b1fa-f8da910c09d0","Type":"ContainerDied","Data":"38788dca94730078682d58c5f4655e40339e966f59618c23f074c9a9e915ad81"} Apr 21 02:41:10.004286 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:10.004247 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:10.004465 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:10.004332 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-cert\") pod \"ingress-canary-zn5hz\" (UID: \"4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a\") " pod="openshift-ingress-canary/ingress-canary-zn5hz" Apr 21 02:41:10.004548 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:10.004484 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 02:41:10.004611 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:10.004563 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-cert podName:4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a nodeName:}" failed. No retries permitted until 2026-04-21 02:41:14.004543617 +0000 UTC m=+41.689087866 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-cert") pod "ingress-canary-zn5hz" (UID: "4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a") : secret "canary-serving-cert" not found Apr 21 02:41:10.004683 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:10.004652 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 02:41:10.004683 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:10.004665 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55f7c6659-qqb6n: secret "image-registry-tls" not found Apr 21 02:41:10.004773 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:10.004704 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls podName:4fc79abb-a912-47eb-a326-62647f5c9486 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:14.004691954 +0000 UTC m=+41.689236200 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls") pod "image-registry-55f7c6659-qqb6n" (UID: "4fc79abb-a912-47eb-a326-62647f5c9486") : secret "image-registry-tls" not found Apr 21 02:41:10.105480 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:10.105397 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e8ab2f22-c931-49de-80fd-45193fa7eda9-metrics-tls\") pod \"dns-default-4c7td\" (UID: \"e8ab2f22-c931-49de-80fd-45193fa7eda9\") " pod="openshift-dns/dns-default-4c7td" Apr 21 02:41:10.105858 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:10.105581 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 02:41:10.105858 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:10.105650 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8ab2f22-c931-49de-80fd-45193fa7eda9-metrics-tls podName:e8ab2f22-c931-49de-80fd-45193fa7eda9 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:14.105628297 +0000 UTC m=+41.790172548 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e8ab2f22-c931-49de-80fd-45193fa7eda9-metrics-tls") pod "dns-default-4c7td" (UID: "e8ab2f22-c931-49de-80fd-45193fa7eda9") : secret "dns-default-metrics-tls" not found Apr 21 02:41:13.093750 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:13.093657 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c777784f9-mqwnx" event={"ID":"eab79a67-eb0a-4e47-b634-7047d26e3904","Type":"ContainerStarted","Data":"a6f2f8e9344a7cee8a5db49aff759d483277756d9bb03bd74d4f150cf69567ec"} Apr 21 02:41:13.096552 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:13.096502 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kqfw8" event={"ID":"773d483d-dfc3-4e6e-b1fa-f8da910c09d0","Type":"ContainerStarted","Data":"a85c897e5b10888c4ad690ec154007c8476ffc382ee0691b34297e53278be1fd"} Apr 21 02:41:13.097653 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:13.097634 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" event={"ID":"9aa02d10-4b83-4818-ab56-8c51e74f5f5d","Type":"ContainerStarted","Data":"507b42e5a6f2ff74d2a407cf9aa0e0df6747abbf6277afbe7604d336a6b599d3"} Apr 21 02:41:13.110413 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:13.110374 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c777784f9-mqwnx" podStartSLOduration=34.289175786 podStartE2EDuration="40.110363245s" podCreationTimestamp="2026-04-21 02:40:33 +0000 UTC" firstStartedPulling="2026-04-21 02:41:06.96821985 +0000 UTC m=+34.652764103" lastFinishedPulling="2026-04-21 02:41:12.789407324 +0000 UTC m=+40.473951562" observedRunningTime="2026-04-21 02:41:13.10928835 +0000 UTC m=+40.793832610" watchObservedRunningTime="2026-04-21 02:41:13.110363245 +0000 UTC m=+40.794907505" Apr 21 02:41:13.128654 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:13.128609 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kqfw8" podStartSLOduration=8.743457166 
podStartE2EDuration="40.128598096s" podCreationTimestamp="2026-04-21 02:40:33 +0000 UTC" firstStartedPulling="2026-04-21 02:40:35.605577451 +0000 UTC m=+3.290121700" lastFinishedPulling="2026-04-21 02:41:06.990718388 +0000 UTC m=+34.675262630" observedRunningTime="2026-04-21 02:41:13.128212066 +0000 UTC m=+40.812756326" watchObservedRunningTime="2026-04-21 02:41:13.128598096 +0000 UTC m=+40.813142356" Apr 21 02:41:14.038678 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:14.038644 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:14.038936 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:14.038803 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 02:41:14.038936 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:14.038828 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55f7c6659-qqb6n: secret "image-registry-tls" not found Apr 21 02:41:14.038936 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:14.038832 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-cert\") pod \"ingress-canary-zn5hz\" (UID: \"4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a\") " pod="openshift-ingress-canary/ingress-canary-zn5hz" Apr 21 02:41:14.038936 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:14.038889 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls podName:4fc79abb-a912-47eb-a326-62647f5c9486 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:22.038867548 +0000 UTC m=+49.723411792 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls") pod "image-registry-55f7c6659-qqb6n" (UID: "4fc79abb-a912-47eb-a326-62647f5c9486") : secret "image-registry-tls" not found Apr 21 02:41:14.039135 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:14.038945 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 02:41:14.039135 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:14.038998 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-cert podName:4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a nodeName:}" failed. No retries permitted until 2026-04-21 02:41:22.03897913 +0000 UTC m=+49.723523374 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-cert") pod "ingress-canary-zn5hz" (UID: "4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a") : secret "canary-serving-cert" not found Apr 21 02:41:14.101881 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:14.101807 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cdc6df59c-qntjh" event={"ID":"92b14238-0b93-4982-a17d-81055b501a7b","Type":"ContainerStarted","Data":"7cfa42a335d8e0b84ce22f72ddcba9b99a41d3b13649a372544f3433faf51410"} Apr 21 02:41:14.115885 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:14.115841 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cdc6df59c-qntjh" podStartSLOduration=34.838340324 podStartE2EDuration="41.115825226s" podCreationTimestamp="2026-04-21 02:40:33 +0000 UTC" firstStartedPulling="2026-04-21 02:41:06.968084214 +0000 UTC m=+34.652628467" lastFinishedPulling="2026-04-21 02:41:13.24556913 +0000 UTC m=+40.930113369" observedRunningTime="2026-04-21 02:41:14.11525971 +0000 UTC m=+41.799803971" watchObservedRunningTime="2026-04-21 02:41:14.115825226 +0000 UTC m=+41.800369486" Apr 21 02:41:14.140367 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:14.140187 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e8ab2f22-c931-49de-80fd-45193fa7eda9-metrics-tls\") pod \"dns-default-4c7td\" (UID: \"e8ab2f22-c931-49de-80fd-45193fa7eda9\") " pod="openshift-dns/dns-default-4c7td" Apr 21 02:41:14.140367 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:14.140273 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 02:41:14.140367 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:14.140338 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8ab2f22-c931-49de-80fd-45193fa7eda9-metrics-tls podName:e8ab2f22-c931-49de-80fd-45193fa7eda9 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:22.140318105 +0000 UTC m=+49.824862354 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e8ab2f22-c931-49de-80fd-45193fa7eda9-metrics-tls") pod "dns-default-4c7td" (UID: "e8ab2f22-c931-49de-80fd-45193fa7eda9") : secret "dns-default-metrics-tls" not found Apr 21 02:41:15.103512 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:15.103473 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cdc6df59c-qntjh" Apr 21 02:41:15.105084 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:15.105061 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cdc6df59c-qntjh" Apr 21 02:41:16.107359 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:16.107325 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" event={"ID":"9aa02d10-4b83-4818-ab56-8c51e74f5f5d","Type":"ContainerStarted","Data":"b6457f1bae6ba9586b7ca6a3c072c24b7166eb5464b8582599a407a7d0ebbf02"} Apr 21 02:41:16.107359 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:16.107362 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" event={"ID":"9aa02d10-4b83-4818-ab56-8c51e74f5f5d","Type":"ContainerStarted","Data":"2877b409afbe4de7362703de4e5684e0a10ce8c553a244108b4f440ff3746f5e"} Apr 21 02:41:16.124035 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:16.123988 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" podStartSLOduration=34.532266714 podStartE2EDuration="43.123974689s" podCreationTimestamp="2026-04-21 02:40:33 +0000 UTC" firstStartedPulling="2026-04-21 02:41:06.968167769 +0000 UTC m=+34.652712021" lastFinishedPulling="2026-04-21 02:41:15.559875758 +0000 UTC m=+43.244419996" observedRunningTime="2026-04-21 02:41:16.123042525 +0000 UTC m=+43.807586785" watchObservedRunningTime="2026-04-21 02:41:16.123974689 +0000 UTC m=+43.808518950" Apr 21 02:41:22.097701 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:22.097659 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:22.098145 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:22.097721 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-cert\") pod \"ingress-canary-zn5hz\" (UID: \"4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a\") " pod="openshift-ingress-canary/ingress-canary-zn5hz" Apr 21 02:41:22.098145 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:22.097802 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 02:41:22.098145 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:22.097822 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 02:41:22.098145 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:22.097847 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-55f7c6659-qqb6n: secret "image-registry-tls" not found Apr 21 02:41:22.098145 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:22.097861 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-cert podName:4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a nodeName:}" failed. No retries permitted until 2026-04-21 02:41:38.097848084 +0000 UTC m=+65.782392323 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-cert") pod "ingress-canary-zn5hz" (UID: "4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a") : secret "canary-serving-cert" not found Apr 21 02:41:22.098145 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:22.097917 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls podName:4fc79abb-a912-47eb-a326-62647f5c9486 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:38.097898615 +0000 UTC m=+65.782442864 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls") pod "image-registry-55f7c6659-qqb6n" (UID: "4fc79abb-a912-47eb-a326-62647f5c9486") : secret "image-registry-tls" not found Apr 21 02:41:22.198418 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:22.198375 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e8ab2f22-c931-49de-80fd-45193fa7eda9-metrics-tls\") pod \"dns-default-4c7td\" (UID: \"e8ab2f22-c931-49de-80fd-45193fa7eda9\") " pod="openshift-dns/dns-default-4c7td" Apr 21 02:41:22.198618 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:22.198536 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 02:41:22.198669 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:22.198637 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8ab2f22-c931-49de-80fd-45193fa7eda9-metrics-tls podName:e8ab2f22-c931-49de-80fd-45193fa7eda9 nodeName:}" failed. No retries permitted until 2026-04-21 02:41:38.198619319 +0000 UTC m=+65.883163557 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e8ab2f22-c931-49de-80fd-45193fa7eda9-metrics-tls") pod "dns-default-4c7td" (UID: "e8ab2f22-c931-49de-80fd-45193fa7eda9") : secret "dns-default-metrics-tls" not found Apr 21 02:41:33.000040 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:33.000011 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cpdzb" Apr 21 02:41:38.123107 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:38.123070 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:41:38.123468 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:38.123130 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-cert\") pod \"ingress-canary-zn5hz\" (UID: \"4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a\") " pod="openshift-ingress-canary/ingress-canary-zn5hz" Apr 21 02:41:38.123468 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:38.123212 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 02:41:38.123468 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:38.123228 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 02:41:38.123468 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:38.123230 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55f7c6659-qqb6n: secret "image-registry-tls" not found Apr 21 02:41:38.123468 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:38.123289 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-cert podName:4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a nodeName:}" failed. No retries permitted until 2026-04-21 02:42:10.123275241 +0000 UTC m=+97.807819487 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-cert") pod "ingress-canary-zn5hz" (UID: "4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a") : secret "canary-serving-cert" not found Apr 21 02:41:38.123468 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:38.123302 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls podName:4fc79abb-a912-47eb-a326-62647f5c9486 nodeName:}" failed. No retries permitted until 2026-04-21 02:42:10.123296592 +0000 UTC m=+97.807840830 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls") pod "image-registry-55f7c6659-qqb6n" (UID: "4fc79abb-a912-47eb-a326-62647f5c9486") : secret "image-registry-tls" not found Apr 21 02:41:38.223665 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:38.223629 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e8ab2f22-c931-49de-80fd-45193fa7eda9-metrics-tls\") pod \"dns-default-4c7td\" (UID: \"e8ab2f22-c931-49de-80fd-45193fa7eda9\") " pod="openshift-dns/dns-default-4c7td" Apr 21 02:41:38.223838 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:38.223780 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 02:41:38.223931 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:38.223857 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8ab2f22-c931-49de-80fd-45193fa7eda9-metrics-tls podName:e8ab2f22-c931-49de-80fd-45193fa7eda9 nodeName:}" failed. No retries permitted until 2026-04-21 02:42:10.223838016 +0000 UTC m=+97.908382260 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e8ab2f22-c931-49de-80fd-45193fa7eda9-metrics-tls") pod "dns-default-4c7td" (UID: "e8ab2f22-c931-49de-80fd-45193fa7eda9") : secret "dns-default-metrics-tls" not found Apr 21 02:41:38.626196 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:38.626154 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dcls5\" (UniqueName: \"kubernetes.io/projected/5d35cb86-19fa-41be-9ae2-d70d8dbe564d-kube-api-access-dcls5\") pod \"network-check-target-cbjx7\" (UID: \"5d35cb86-19fa-41be-9ae2-d70d8dbe564d\") " pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:41:38.626444 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:38.626238 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7cc43e1f-6f61-404a-ad72-d62ed23cea64-original-pull-secret\") pod \"global-pull-secret-syncer-xlxwj\" (UID: \"7cc43e1f-6f61-404a-ad72-d62ed23cea64\") " pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:41:38.629114 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:38.629090 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 02:41:38.629228 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:38.629149 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 02:41:38.638895 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:38.638870 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 02:41:38.640002 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:38.639985 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7cc43e1f-6f61-404a-ad72-d62ed23cea64-original-pull-secret\") pod \"global-pull-secret-syncer-xlxwj\" (UID: \"7cc43e1f-6f61-404a-ad72-d62ed23cea64\") " pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:41:38.649560 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:38.649537 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dcls5\" (UniqueName: \"kubernetes.io/projected/5d35cb86-19fa-41be-9ae2-d70d8dbe564d-kube-api-access-dcls5\") pod \"network-check-target-cbjx7\" (UID: \"5d35cb86-19fa-41be-9ae2-d70d8dbe564d\") " pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:41:38.727134 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:38.727087 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs\") pod \"network-metrics-daemon-bzdk8\" (UID: \"c2a4d15a-56b4-43a2-b85f-305025a28b5e\") " pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:41:38.729509 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:38.729486 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 02:41:38.738038 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:38.738019 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 02:41:38.738120 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:41:38.738100 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs podName:c2a4d15a-56b4-43a2-b85f-305025a28b5e nodeName:}" failed. No retries permitted until 2026-04-21 02:42:42.738078915 +0000 UTC m=+130.422623165 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs") pod "network-metrics-daemon-bzdk8" (UID: "c2a4d15a-56b4-43a2-b85f-305025a28b5e") : secret "metrics-daemon-secret" not found Apr 21 02:41:38.822717 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:38.822679 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xlxwj" Apr 21 02:41:38.836937 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:38.836916 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nsvjd\"" Apr 21 02:41:38.845620 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:38.845595 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:41:38.946860 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:38.946817 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xlxwj"] Apr 21 02:41:38.951189 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:41:38.951151 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cc43e1f_6f61_404a_ad72_d62ed23cea64.slice/crio-8f7b2f2f2e81e5e81a10c89059ff1068fb00863f20bf98c7e98896724001883f WatchSource:0}: Error finding container 8f7b2f2f2e81e5e81a10c89059ff1068fb00863f20bf98c7e98896724001883f: Status 404 returned error can't find the container with id 8f7b2f2f2e81e5e81a10c89059ff1068fb00863f20bf98c7e98896724001883f Apr 21 02:41:38.971120 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:38.971092 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cbjx7"] Apr 21 02:41:38.986665 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:41:38.986635 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d35cb86_19fa_41be_9ae2_d70d8dbe564d.slice/crio-66639f122e0a03cd3fc15c4bbf2c9b1f05758f23e97a109e609908f86d460247 WatchSource:0}: Error finding container 66639f122e0a03cd3fc15c4bbf2c9b1f05758f23e97a109e609908f86d460247: Status 404 returned error can't find the container with id 66639f122e0a03cd3fc15c4bbf2c9b1f05758f23e97a109e609908f86d460247 Apr 21 02:41:39.151339 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:39.151235 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xlxwj" event={"ID":"7cc43e1f-6f61-404a-ad72-d62ed23cea64","Type":"ContainerStarted","Data":"8f7b2f2f2e81e5e81a10c89059ff1068fb00863f20bf98c7e98896724001883f"} Apr 21 02:41:39.152207 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:39.152178 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cbjx7" event={"ID":"5d35cb86-19fa-41be-9ae2-d70d8dbe564d","Type":"ContainerStarted","Data":"66639f122e0a03cd3fc15c4bbf2c9b1f05758f23e97a109e609908f86d460247"} Apr 21 02:41:43.163018 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:43.162977 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cbjx7" event={"ID":"5d35cb86-19fa-41be-9ae2-d70d8dbe564d","Type":"ContainerStarted","Data":"ac17854a34dfd79591ce03ae75067f073b538dd32b0fbb838156915db553b4af"} Apr 21 02:41:43.163473 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:43.163251 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:41:43.177233 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:43.177180 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-cbjx7" podStartSLOduration=67.9086612 podStartE2EDuration="1m11.177165442s" podCreationTimestamp="2026-04-21 02:40:32 +0000 UTC" firstStartedPulling="2026-04-21 02:41:38.988497557 +0000 UTC m=+66.673041796" lastFinishedPulling="2026-04-21 02:41:42.257001783 +0000 UTC m=+69.941546038" observedRunningTime="2026-04-21 02:41:43.176356265 +0000 UTC m=+70.860900548" watchObservedRunningTime="2026-04-21 02:41:43.177165442 +0000 UTC m=+70.861709704" Apr 21 02:41:45.169819 ip-10-0-137-147 
kubenswrapper[2572]: I0421 02:41:45.169761 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xlxwj" event={"ID":"7cc43e1f-6f61-404a-ad72-d62ed23cea64","Type":"ContainerStarted","Data":"1176bf51ce222ec877ca89cada7e987d2e40d1417358894e6e96539c5e5f84f4"} Apr 21 02:41:45.182840 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:41:45.182779 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-xlxwj" podStartSLOduration=66.856450446 podStartE2EDuration="1m12.18276134s" podCreationTimestamp="2026-04-21 02:40:33 +0000 UTC" firstStartedPulling="2026-04-21 02:41:38.952937254 +0000 UTC m=+66.637481507" lastFinishedPulling="2026-04-21 02:41:44.279248158 +0000 UTC m=+71.963792401" observedRunningTime="2026-04-21 02:41:45.182540788 +0000 UTC m=+72.867085043" watchObservedRunningTime="2026-04-21 02:41:45.18276134 +0000 UTC m=+72.867305603" Apr 21 02:42:10.162437 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:10.162396 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:42:10.162909 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:10.162458 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-cert\") pod \"ingress-canary-zn5hz\" (UID: \"4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a\") " pod="openshift-ingress-canary/ingress-canary-zn5hz" Apr 21 02:42:10.162909 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:42:10.162561 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 02:42:10.162909 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:42:10.162576 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 02:42:10.162909 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:42:10.162583 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55f7c6659-qqb6n: secret "image-registry-tls" not found Apr 21 02:42:10.162909 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:42:10.162652 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls podName:4fc79abb-a912-47eb-a326-62647f5c9486 nodeName:}" failed. No retries permitted until 2026-04-21 02:43:14.162635328 +0000 UTC m=+161.847179571 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls") pod "image-registry-55f7c6659-qqb6n" (UID: "4fc79abb-a912-47eb-a326-62647f5c9486") : secret "image-registry-tls" not found Apr 21 02:42:10.162909 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:42:10.162667 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-cert podName:4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a nodeName:}" failed. No retries permitted until 2026-04-21 02:43:14.162660668 +0000 UTC m=+161.847204906 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-cert") pod "ingress-canary-zn5hz" (UID: "4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a") : secret "canary-serving-cert" not found Apr 21 02:42:10.263596 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:10.263562 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e8ab2f22-c931-49de-80fd-45193fa7eda9-metrics-tls\") pod \"dns-default-4c7td\" (UID: \"e8ab2f22-c931-49de-80fd-45193fa7eda9\") " pod="openshift-dns/dns-default-4c7td" Apr 21 02:42:10.263762 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:42:10.263690 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 02:42:10.263762 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:42:10.263753 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8ab2f22-c931-49de-80fd-45193fa7eda9-metrics-tls podName:e8ab2f22-c931-49de-80fd-45193fa7eda9 nodeName:}" failed. No retries permitted until 2026-04-21 02:43:14.26373631 +0000 UTC m=+161.948280556 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e8ab2f22-c931-49de-80fd-45193fa7eda9-metrics-tls") pod "dns-default-4c7td" (UID: "e8ab2f22-c931-49de-80fd-45193fa7eda9") : secret "dns-default-metrics-tls" not found Apr 21 02:42:14.167365 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:14.167330 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-cbjx7" Apr 21 02:42:22.556184 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:22.556155 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9mbn4_a9bda1dd-f3d4-41e7-9167-d144e08a951c/dns-node-resolver/0.log" Apr 21 02:42:23.756383 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:23.756353 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wwdwn_d62dffc6-07e2-43c5-929f-e5547bc6cbb9/node-ca/0.log" Apr 21 02:42:42.811801 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:42.811751 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs\") pod \"network-metrics-daemon-bzdk8\" (UID: \"c2a4d15a-56b4-43a2-b85f-305025a28b5e\") " pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:42:42.812358 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:42:42.811886 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 02:42:42.812358 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:42:42.811963 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs podName:c2a4d15a-56b4-43a2-b85f-305025a28b5e nodeName:}" failed. No retries permitted until 2026-04-21 02:44:44.811939745 +0000 UTC m=+252.496483993 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs") pod "network-metrics-daemon-bzdk8" (UID: "c2a4d15a-56b4-43a2-b85f-305025a28b5e") : secret "metrics-daemon-secret" not found Apr 21 02:42:55.020041 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:55.020007 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-9khlr"] Apr 21 02:42:55.022908 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:55.022891 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9khlr" Apr 21 02:42:55.026705 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:55.026681 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 02:42:55.026848 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:55.026684 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-s6bnd\"" Apr 21 02:42:55.027021 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:55.027005 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 02:42:55.027067 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:55.027045 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 02:42:55.027439 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:55.027426 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 02:42:55.039462 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:55.039400 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9khlr"] Apr 21 02:42:55.112414 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:55.112367 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf6hb\" (UniqueName: \"kubernetes.io/projected/04ae7926-9039-48f6-910e-3c6deeb48e8a-kube-api-access-lf6hb\") pod \"insights-runtime-extractor-9khlr\" (UID: \"04ae7926-9039-48f6-910e-3c6deeb48e8a\") " pod="openshift-insights/insights-runtime-extractor-9khlr" Apr 21 02:42:55.112414 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:55.112418 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/04ae7926-9039-48f6-910e-3c6deeb48e8a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9khlr\" (UID: \"04ae7926-9039-48f6-910e-3c6deeb48e8a\") " pod="openshift-insights/insights-runtime-extractor-9khlr" Apr 21 02:42:55.112670 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:55.112560 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/04ae7926-9039-48f6-910e-3c6deeb48e8a-crio-socket\") pod \"insights-runtime-extractor-9khlr\" (UID: \"04ae7926-9039-48f6-910e-3c6deeb48e8a\") " pod="openshift-insights/insights-runtime-extractor-9khlr" Apr 21 02:42:55.112670 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:55.112597 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/04ae7926-9039-48f6-910e-3c6deeb48e8a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9khlr\" (UID: \"04ae7926-9039-48f6-910e-3c6deeb48e8a\") " pod="openshift-insights/insights-runtime-extractor-9khlr" Apr 21 02:42:55.112737 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:55.112681 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/04ae7926-9039-48f6-910e-3c6deeb48e8a-data-volume\") pod \"insights-runtime-extractor-9khlr\" (UID: \"04ae7926-9039-48f6-910e-3c6deeb48e8a\") " pod="openshift-insights/insights-runtime-extractor-9khlr" Apr 21 02:42:55.213722 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:55.213686 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/04ae7926-9039-48f6-910e-3c6deeb48e8a-crio-socket\") pod \"insights-runtime-extractor-9khlr\" (UID: \"04ae7926-9039-48f6-910e-3c6deeb48e8a\") " pod="openshift-insights/insights-runtime-extractor-9khlr" Apr 21 02:42:55.213722 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:55.213724 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/04ae7926-9039-48f6-910e-3c6deeb48e8a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9khlr\" (UID: \"04ae7926-9039-48f6-910e-3c6deeb48e8a\") " pod="openshift-insights/insights-runtime-extractor-9khlr" Apr 21 02:42:55.213896 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:55.213808 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/04ae7926-9039-48f6-910e-3c6deeb48e8a-crio-socket\") pod \"insights-runtime-extractor-9khlr\" (UID: \"04ae7926-9039-48f6-910e-3c6deeb48e8a\") " pod="openshift-insights/insights-runtime-extractor-9khlr" Apr 21 02:42:55.213896 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:55.213884 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/04ae7926-9039-48f6-910e-3c6deeb48e8a-data-volume\") pod \"insights-runtime-extractor-9khlr\" (UID: \"04ae7926-9039-48f6-910e-3c6deeb48e8a\") " pod="openshift-insights/insights-runtime-extractor-9khlr" Apr 21 02:42:55.213971 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:55.213923 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lf6hb\" (UniqueName: \"kubernetes.io/projected/04ae7926-9039-48f6-910e-3c6deeb48e8a-kube-api-access-lf6hb\") pod \"insights-runtime-extractor-9khlr\" (UID: \"04ae7926-9039-48f6-910e-3c6deeb48e8a\") " pod="openshift-insights/insights-runtime-extractor-9khlr" Apr 21 02:42:55.213971 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:55.213955 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/04ae7926-9039-48f6-910e-3c6deeb48e8a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9khlr\" (UID: \"04ae7926-9039-48f6-910e-3c6deeb48e8a\") " pod="openshift-insights/insights-runtime-extractor-9khlr" Apr 21 02:42:55.214137 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:55.214121 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/04ae7926-9039-48f6-910e-3c6deeb48e8a-data-volume\") pod \"insights-runtime-extractor-9khlr\" 
(UID: \"04ae7926-9039-48f6-910e-3c6deeb48e8a\") " pod="openshift-insights/insights-runtime-extractor-9khlr" Apr 21 02:42:55.214338 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:55.214319 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/04ae7926-9039-48f6-910e-3c6deeb48e8a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9khlr\" (UID: \"04ae7926-9039-48f6-910e-3c6deeb48e8a\") " pod="openshift-insights/insights-runtime-extractor-9khlr" Apr 21 02:42:55.216294 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:55.216273 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/04ae7926-9039-48f6-910e-3c6deeb48e8a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9khlr\" (UID: \"04ae7926-9039-48f6-910e-3c6deeb48e8a\") " pod="openshift-insights/insights-runtime-extractor-9khlr" Apr 21 02:42:55.223819 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:55.223793 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf6hb\" (UniqueName: \"kubernetes.io/projected/04ae7926-9039-48f6-910e-3c6deeb48e8a-kube-api-access-lf6hb\") pod \"insights-runtime-extractor-9khlr\" (UID: \"04ae7926-9039-48f6-910e-3c6deeb48e8a\") " pod="openshift-insights/insights-runtime-extractor-9khlr" Apr 21 02:42:55.332277 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:55.332198 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9khlr" Apr 21 02:42:55.452735 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:55.452701 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9khlr"] Apr 21 02:42:55.456394 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:42:55.456356 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04ae7926_9039_48f6_910e_3c6deeb48e8a.slice/crio-dd81976e1db7d3f3b02c1c7c31e73f59cddf2c76a8240d9054375f2101c4fc3b WatchSource:0}: Error finding container dd81976e1db7d3f3b02c1c7c31e73f59cddf2c76a8240d9054375f2101c4fc3b: Status 404 returned error can't find the container with id dd81976e1db7d3f3b02c1c7c31e73f59cddf2c76a8240d9054375f2101c4fc3b Apr 21 02:42:56.333896 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:56.333857 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9khlr" event={"ID":"04ae7926-9039-48f6-910e-3c6deeb48e8a","Type":"ContainerStarted","Data":"efe1b9153d787a62dd62879e169513c1fb3a610d914f35806193bad99e3e04db"} Apr 21 02:42:56.333896 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:56.333896 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9khlr" event={"ID":"04ae7926-9039-48f6-910e-3c6deeb48e8a","Type":"ContainerStarted","Data":"26c5032d3b4093aae14fdd786c222279f02936d33dbfcc0d8372c0ad9443d457"} Apr 21 02:42:56.334311 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:56.333909 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9khlr" event={"ID":"04ae7926-9039-48f6-910e-3c6deeb48e8a","Type":"ContainerStarted","Data":"dd81976e1db7d3f3b02c1c7c31e73f59cddf2c76a8240d9054375f2101c4fc3b"} Apr 21 02:42:58.340257 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:58.340218 2572 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-insights/insights-runtime-extractor-9khlr" event={"ID":"04ae7926-9039-48f6-910e-3c6deeb48e8a","Type":"ContainerStarted","Data":"9a68bbf1c268a40f59727e9d243148886ced83749e12d1603bdb31a34b2c251f"} Apr 21 02:42:58.357820 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:42:58.357752 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-9khlr" podStartSLOduration=2.551066181 podStartE2EDuration="4.35773498s" podCreationTimestamp="2026-04-21 02:42:54 +0000 UTC" firstStartedPulling="2026-04-21 02:42:55.513436707 +0000 UTC m=+143.197980945" lastFinishedPulling="2026-04-21 02:42:57.320105494 +0000 UTC m=+145.004649744" observedRunningTime="2026-04-21 02:42:58.357203466 +0000 UTC m=+146.041747727" watchObservedRunningTime="2026-04-21 02:42:58.35773498 +0000 UTC m=+146.042279258" Apr 21 02:43:01.916471 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:01.916416 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-bl4vc"] Apr 21 02:43:01.920197 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:01.920164 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 02:43:01.922796 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:01.922769 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-blx9s\"" Apr 21 02:43:01.922961 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:01.922852 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 02:43:01.923717 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:01.923688 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 02:43:01.923830 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:01.923808 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 02:43:01.923884 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:01.923847 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 02:43:01.924155 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:01.924141 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 02:43:01.924155 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:01.924149 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 02:43:02.068473 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.068414 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1d063354-c5a3-4ad2-a124-25953ad1623e-node-exporter-accelerators-collector-config\") pod \"node-exporter-bl4vc\" (UID: \"1d063354-c5a3-4ad2-a124-25953ad1623e\") " pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 02:43:02.068698 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.068497 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1d063354-c5a3-4ad2-a124-25953ad1623e-sys\") pod 
\"node-exporter-bl4vc\" (UID: \"1d063354-c5a3-4ad2-a124-25953ad1623e\") " pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 02:43:02.068698 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.068609 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1d063354-c5a3-4ad2-a124-25953ad1623e-node-exporter-tls\") pod \"node-exporter-bl4vc\" (UID: \"1d063354-c5a3-4ad2-a124-25953ad1623e\") " pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 02:43:02.068698 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.068641 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1d063354-c5a3-4ad2-a124-25953ad1623e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bl4vc\" (UID: \"1d063354-c5a3-4ad2-a124-25953ad1623e\") " pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 02:43:02.068698 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.068659 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1d063354-c5a3-4ad2-a124-25953ad1623e-metrics-client-ca\") pod \"node-exporter-bl4vc\" (UID: \"1d063354-c5a3-4ad2-a124-25953ad1623e\") " pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 02:43:02.068698 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.068676 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1d063354-c5a3-4ad2-a124-25953ad1623e-node-exporter-textfile\") pod \"node-exporter-bl4vc\" (UID: \"1d063354-c5a3-4ad2-a124-25953ad1623e\") " pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 02:43:02.068904 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.068708 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnnjj\" (UniqueName: \"kubernetes.io/projected/1d063354-c5a3-4ad2-a124-25953ad1623e-kube-api-access-xnnjj\") pod \"node-exporter-bl4vc\" (UID: \"1d063354-c5a3-4ad2-a124-25953ad1623e\") " pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 02:43:02.068904 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.068739 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1d063354-c5a3-4ad2-a124-25953ad1623e-node-exporter-wtmp\") pod \"node-exporter-bl4vc\" (UID: \"1d063354-c5a3-4ad2-a124-25953ad1623e\") " pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 02:43:02.068904 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.068778 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1d063354-c5a3-4ad2-a124-25953ad1623e-root\") pod \"node-exporter-bl4vc\" (UID: \"1d063354-c5a3-4ad2-a124-25953ad1623e\") " pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 02:43:02.169998 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.169910 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1d063354-c5a3-4ad2-a124-25953ad1623e-node-exporter-tls\") pod \"node-exporter-bl4vc\" (UID: \"1d063354-c5a3-4ad2-a124-25953ad1623e\") " pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 
02:43:02.169998 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.169950 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1d063354-c5a3-4ad2-a124-25953ad1623e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bl4vc\" (UID: \"1d063354-c5a3-4ad2-a124-25953ad1623e\") " pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 02:43:02.169998 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.169970 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1d063354-c5a3-4ad2-a124-25953ad1623e-metrics-client-ca\") pod \"node-exporter-bl4vc\" (UID: \"1d063354-c5a3-4ad2-a124-25953ad1623e\") " pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 02:43:02.170195 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.170155 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1d063354-c5a3-4ad2-a124-25953ad1623e-node-exporter-textfile\") pod \"node-exporter-bl4vc\" (UID: \"1d063354-c5a3-4ad2-a124-25953ad1623e\") " pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 02:43:02.170241 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.170196 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xnnjj\" (UniqueName: \"kubernetes.io/projected/1d063354-c5a3-4ad2-a124-25953ad1623e-kube-api-access-xnnjj\") pod \"node-exporter-bl4vc\" (UID: \"1d063354-c5a3-4ad2-a124-25953ad1623e\") " pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 02:43:02.170241 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.170232 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1d063354-c5a3-4ad2-a124-25953ad1623e-node-exporter-wtmp\") pod \"node-exporter-bl4vc\" (UID: \"1d063354-c5a3-4ad2-a124-25953ad1623e\") " pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 02:43:02.170327 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.170269 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1d063354-c5a3-4ad2-a124-25953ad1623e-root\") pod \"node-exporter-bl4vc\" (UID: \"1d063354-c5a3-4ad2-a124-25953ad1623e\") " pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 02:43:02.170382 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.170338 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1d063354-c5a3-4ad2-a124-25953ad1623e-node-exporter-accelerators-collector-config\") pod \"node-exporter-bl4vc\" (UID: \"1d063354-c5a3-4ad2-a124-25953ad1623e\") " pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 02:43:02.170431 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.170379 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1d063354-c5a3-4ad2-a124-25953ad1623e-node-exporter-wtmp\") pod \"node-exporter-bl4vc\" (UID: \"1d063354-c5a3-4ad2-a124-25953ad1623e\") " pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 02:43:02.170431 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.170387 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/1d063354-c5a3-4ad2-a124-25953ad1623e-sys\") pod \"node-exporter-bl4vc\" (UID: \"1d063354-c5a3-4ad2-a124-25953ad1623e\") " pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 02:43:02.170512 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.170426 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1d063354-c5a3-4ad2-a124-25953ad1623e-sys\") pod \"node-exporter-bl4vc\" (UID: \"1d063354-c5a3-4ad2-a124-25953ad1623e\") " pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 02:43:02.170512 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.170477 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1d063354-c5a3-4ad2-a124-25953ad1623e-node-exporter-textfile\") pod \"node-exporter-bl4vc\" (UID: \"1d063354-c5a3-4ad2-a124-25953ad1623e\") " pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 02:43:02.170637 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.170502 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1d063354-c5a3-4ad2-a124-25953ad1623e-root\") pod \"node-exporter-bl4vc\" (UID: \"1d063354-c5a3-4ad2-a124-25953ad1623e\") " pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 02:43:02.170637 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.170620 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1d063354-c5a3-4ad2-a124-25953ad1623e-metrics-client-ca\") pod \"node-exporter-bl4vc\" (UID: \"1d063354-c5a3-4ad2-a124-25953ad1623e\") " pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 02:43:02.170900 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.170882 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1d063354-c5a3-4ad2-a124-25953ad1623e-node-exporter-accelerators-collector-config\") pod \"node-exporter-bl4vc\" (UID: \"1d063354-c5a3-4ad2-a124-25953ad1623e\") " pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 02:43:02.172830 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.172807 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1d063354-c5a3-4ad2-a124-25953ad1623e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bl4vc\" (UID: \"1d063354-c5a3-4ad2-a124-25953ad1623e\") " pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 02:43:02.172951 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.172927 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1d063354-c5a3-4ad2-a124-25953ad1623e-node-exporter-tls\") pod \"node-exporter-bl4vc\" (UID: \"1d063354-c5a3-4ad2-a124-25953ad1623e\") " pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 02:43:02.177160 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.177138 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnnjj\" (UniqueName: \"kubernetes.io/projected/1d063354-c5a3-4ad2-a124-25953ad1623e-kube-api-access-xnnjj\") pod \"node-exporter-bl4vc\" (UID: \"1d063354-c5a3-4ad2-a124-25953ad1623e\") " pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 02:43:02.229182 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.229152 2572 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-bl4vc" Apr 21 02:43:02.239582 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:43:02.239556 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d063354_c5a3_4ad2_a124_25953ad1623e.slice/crio-a5fa2875ba40a702a2e853c1ab4fcf165c538422ccedc95e6f5c93f9d20f4672 WatchSource:0}: Error finding container a5fa2875ba40a702a2e853c1ab4fcf165c538422ccedc95e6f5c93f9d20f4672: Status 404 returned error can't find the container with id a5fa2875ba40a702a2e853c1ab4fcf165c538422ccedc95e6f5c93f9d20f4672 Apr 21 02:43:02.351222 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:02.351177 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bl4vc" event={"ID":"1d063354-c5a3-4ad2-a124-25953ad1623e","Type":"ContainerStarted","Data":"a5fa2875ba40a702a2e853c1ab4fcf165c538422ccedc95e6f5c93f9d20f4672"} Apr 21 02:43:03.354511 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:03.354476 2572 generic.go:358] "Generic (PLEG): container finished" podID="1d063354-c5a3-4ad2-a124-25953ad1623e" containerID="0c1665716090262276baecbf963f8a2a2d6c6f649ec523ea6736d2b0fb42089c" exitCode=0 Apr 21 02:43:03.354892 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:03.354563 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bl4vc" event={"ID":"1d063354-c5a3-4ad2-a124-25953ad1623e","Type":"ContainerDied","Data":"0c1665716090262276baecbf963f8a2a2d6c6f649ec523ea6736d2b0fb42089c"} Apr 21 02:43:04.358256 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:04.358219 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bl4vc" event={"ID":"1d063354-c5a3-4ad2-a124-25953ad1623e","Type":"ContainerStarted","Data":"4b8017ec0909ef24830c2a3c6ec40e5f928203266e1a621876b47206c8cc0a63"} Apr 21 02:43:04.358256 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:04.358258 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bl4vc" event={"ID":"1d063354-c5a3-4ad2-a124-25953ad1623e","Type":"ContainerStarted","Data":"2fbe7df3f050916ecb396d04e761aa4c0a94549301401c414c0f1e2f0bd100a4"} Apr 21 02:43:04.378797 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:04.378746 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-bl4vc" podStartSLOduration=2.691889552 podStartE2EDuration="3.378728797s" podCreationTimestamp="2026-04-21 02:43:01 +0000 UTC" firstStartedPulling="2026-04-21 02:43:02.241705727 +0000 UTC m=+149.926249968" lastFinishedPulling="2026-04-21 02:43:02.928544975 +0000 UTC m=+150.613089213" observedRunningTime="2026-04-21 02:43:04.377512844 +0000 UTC m=+152.062057105" watchObservedRunningTime="2026-04-21 02:43:04.378728797 +0000 UTC m=+152.063273099" Apr 21 02:43:09.242004 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:43:09.241934 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" podUID="4fc79abb-a912-47eb-a326-62647f5c9486" Apr 21 02:43:09.266976 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:43:09.266919 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" 
pod="openshift-ingress-canary/ingress-canary-zn5hz" podUID="4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a" Apr 21 02:43:09.288124 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:43:09.288087 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-4c7td" podUID="e8ab2f22-c931-49de-80fd-45193fa7eda9" Apr 21 02:43:09.370730 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:09.370703 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:43:10.943215 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:43:10.943170 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-bzdk8" podUID="c2a4d15a-56b4-43a2-b85f-305025a28b5e" Apr 21 02:43:14.175774 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:14.175733 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:43:14.176254 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:14.175786 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-cert\") pod \"ingress-canary-zn5hz\" (UID: \"4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a\") " pod="openshift-ingress-canary/ingress-canary-zn5hz" Apr 21 02:43:14.178343 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:14.178317 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a-cert\") pod \"ingress-canary-zn5hz\" (UID: \"4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a\") " pod="openshift-ingress-canary/ingress-canary-zn5hz" Apr 21 02:43:14.178443 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:14.178353 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls\") pod \"image-registry-55f7c6659-qqb6n\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:43:14.276834 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:14.276795 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e8ab2f22-c931-49de-80fd-45193fa7eda9-metrics-tls\") pod \"dns-default-4c7td\" (UID: \"e8ab2f22-c931-49de-80fd-45193fa7eda9\") " pod="openshift-dns/dns-default-4c7td" Apr 21 02:43:14.279663 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:14.279643 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e8ab2f22-c931-49de-80fd-45193fa7eda9-metrics-tls\") pod \"dns-default-4c7td\" (UID: \"e8ab2f22-c931-49de-80fd-45193fa7eda9\") " pod="openshift-dns/dns-default-4c7td" Apr 21 02:43:14.473978 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:14.473945 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nlcjx\"" Apr 21 02:43:14.481839 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:14.481808 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:43:14.601537 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:14.601487 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-55f7c6659-qqb6n"] Apr 21 02:43:14.604567 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:43:14.604514 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fc79abb_a912_47eb_a326_62647f5c9486.slice/crio-2b615f40d3e9cd490c911078153c45406c9b62fc2d5b42205cf828b4c3118b75 WatchSource:0}: Error finding container 2b615f40d3e9cd490c911078153c45406c9b62fc2d5b42205cf828b4c3118b75: Status 404 returned error can't find the container with id 2b615f40d3e9cd490c911078153c45406c9b62fc2d5b42205cf828b4c3118b75 Apr 21 02:43:15.386431 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:15.386392 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" event={"ID":"4fc79abb-a912-47eb-a326-62647f5c9486","Type":"ContainerStarted","Data":"6ad8458b94fe566eb7ae33ac35101065db05e2ca72a563ce80f06e4cf9235c31"} Apr 21 02:43:15.386431 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:15.386435 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" event={"ID":"4fc79abb-a912-47eb-a326-62647f5c9486","Type":"ContainerStarted","Data":"2b615f40d3e9cd490c911078153c45406c9b62fc2d5b42205cf828b4c3118b75"} Apr 21 02:43:15.386895 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:15.386555 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:43:15.404261 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:15.404209 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" podStartSLOduration=162.404193929 podStartE2EDuration="2m42.404193929s" podCreationTimestamp="2026-04-21 02:40:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 02:43:15.403896293 +0000 UTC m=+163.088440564" watchObservedRunningTime="2026-04-21 02:43:15.404193929 +0000 UTC m=+163.088738188" Apr 21 02:43:17.009509 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:17.009475 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-55f7c6659-qqb6n"] Apr 21 02:43:21.905185 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:21.905143 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:43:21.905604 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:21.905142 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4c7td" Apr 21 02:43:21.907249 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:21.907233 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6wzzj\"" Apr 21 02:43:21.915818 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:21.915793 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-4c7td" Apr 21 02:43:22.034428 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:22.034395 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4c7td"] Apr 21 02:43:22.037792 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:43:22.037760 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8ab2f22_c931_49de_80fd_45193fa7eda9.slice/crio-5bf551c0bd52ee6b24bcb88b3c18400e4aebf9bf1905cb9c9e0b8dd9ecc64906 WatchSource:0}: Error finding container 5bf551c0bd52ee6b24bcb88b3c18400e4aebf9bf1905cb9c9e0b8dd9ecc64906: Status 404 returned error can't find the container with id 5bf551c0bd52ee6b24bcb88b3c18400e4aebf9bf1905cb9c9e0b8dd9ecc64906 Apr 21 02:43:22.406062 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:22.406023 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4c7td" event={"ID":"e8ab2f22-c931-49de-80fd-45193fa7eda9","Type":"ContainerStarted","Data":"5bf551c0bd52ee6b24bcb88b3c18400e4aebf9bf1905cb9c9e0b8dd9ecc64906"} Apr 21 02:43:22.906829 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:22.906794 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zn5hz" Apr 21 02:43:22.909490 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:22.909469 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lm7rd\"" Apr 21 02:43:22.918239 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:22.918217 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zn5hz" Apr 21 02:43:23.324787 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:23.324744 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zn5hz"] Apr 21 02:43:23.328319 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:43:23.328280 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bdbc6e8_8ad6_4643_b8ca_75ba5c93e85a.slice/crio-1eb0a1c722e15236876d1153300bacbffc62a2fcdc09bbd9db14e8342121d927 WatchSource:0}: Error finding container 1eb0a1c722e15236876d1153300bacbffc62a2fcdc09bbd9db14e8342121d927: Status 404 returned error can't find the container with id 1eb0a1c722e15236876d1153300bacbffc62a2fcdc09bbd9db14e8342121d927 Apr 21 02:43:23.410275 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:23.410201 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zn5hz" event={"ID":"4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a","Type":"ContainerStarted","Data":"1eb0a1c722e15236876d1153300bacbffc62a2fcdc09bbd9db14e8342121d927"} Apr 21 02:43:23.411940 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:23.411905 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4c7td" event={"ID":"e8ab2f22-c931-49de-80fd-45193fa7eda9","Type":"ContainerStarted","Data":"4f398c5b04f136677919c074c550962e35753df84d3ac0fa6dde15138a343431"} Apr 21 02:43:24.417715 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:24.417678 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4c7td" event={"ID":"e8ab2f22-c931-49de-80fd-45193fa7eda9","Type":"ContainerStarted","Data":"186e6d86b4c9b159eff3e196d6e23c3cd90fe6b114082bfd3b7a7e07c03d02fa"} Apr 21 02:43:24.418104 ip-10-0-137-147 
kubenswrapper[2572]: I0421 02:43:24.417795 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-4c7td" Apr 21 02:43:24.431970 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:24.431905 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4c7td" podStartSLOduration=137.223209063 podStartE2EDuration="2m18.431887287s" podCreationTimestamp="2026-04-21 02:41:06 +0000 UTC" firstStartedPulling="2026-04-21 02:43:22.040097437 +0000 UTC m=+169.724641678" lastFinishedPulling="2026-04-21 02:43:23.24877566 +0000 UTC m=+170.933319902" observedRunningTime="2026-04-21 02:43:24.431661154 +0000 UTC m=+172.116205415" watchObservedRunningTime="2026-04-21 02:43:24.431887287 +0000 UTC m=+172.116431552" Apr 21 02:43:25.421799 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:25.421753 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zn5hz" event={"ID":"4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a","Type":"ContainerStarted","Data":"544bcc563139da82680ec469ee7b20a40f6dccd3300ebefb77600e4fb9a0a712"} Apr 21 02:43:25.437568 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:25.437497 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zn5hz" podStartSLOduration=137.861407848 podStartE2EDuration="2m19.43748294s" podCreationTimestamp="2026-04-21 02:41:06 +0000 UTC" firstStartedPulling="2026-04-21 02:43:23.330469144 +0000 UTC m=+171.015013381" lastFinishedPulling="2026-04-21 02:43:24.906544222 +0000 UTC m=+172.591088473" observedRunningTime="2026-04-21 02:43:25.436717043 +0000 UTC m=+173.121261304" watchObservedRunningTime="2026-04-21 02:43:25.43748294 +0000 UTC m=+173.122027199" Apr 21 02:43:34.424019 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:34.423985 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4c7td" Apr 21 02:43:37.397256 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:37.397226 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:43:42.409933 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:42.409864 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" podUID="4fc79abb-a912-47eb-a326-62647f5c9486" containerName="registry" containerID="cri-o://6ad8458b94fe566eb7ae33ac35101065db05e2ca72a563ce80f06e4cf9235c31" gracePeriod=30 Apr 21 02:43:43.643985 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:43.643959 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:43:43.697641 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:43.697603 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4fc79abb-a912-47eb-a326-62647f5c9486-ca-trust-extracted\") pod \"4fc79abb-a912-47eb-a326-62647f5c9486\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " Apr 21 02:43:43.697641 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:43.697648 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-bound-sa-token\") pod \"4fc79abb-a912-47eb-a326-62647f5c9486\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " Apr 21 02:43:43.697894 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:43.697668 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fc79abb-a912-47eb-a326-62647f5c9486-trusted-ca\") pod \"4fc79abb-a912-47eb-a326-62647f5c9486\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " Apr 21 02:43:43.697894 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:43.697687 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxnpq\" (UniqueName: \"kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-kube-api-access-wxnpq\") pod \"4fc79abb-a912-47eb-a326-62647f5c9486\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " Apr 21 02:43:43.697894 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:43.697703 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4fc79abb-a912-47eb-a326-62647f5c9486-registry-certificates\") pod \"4fc79abb-a912-47eb-a326-62647f5c9486\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " Apr 21 02:43:43.697894 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:43.697740 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4fc79abb-a912-47eb-a326-62647f5c9486-image-registry-private-configuration\") pod \"4fc79abb-a912-47eb-a326-62647f5c9486\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " Apr 21 02:43:43.697894 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:43.697774 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4fc79abb-a912-47eb-a326-62647f5c9486-installation-pull-secrets\") pod \"4fc79abb-a912-47eb-a326-62647f5c9486\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " Apr 21 02:43:43.697894 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:43.697812 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls\") pod \"4fc79abb-a912-47eb-a326-62647f5c9486\" (UID: \"4fc79abb-a912-47eb-a326-62647f5c9486\") " Apr 21 02:43:43.698222 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:43.698184 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fc79abb-a912-47eb-a326-62647f5c9486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4fc79abb-a912-47eb-a326-62647f5c9486" (UID: "4fc79abb-a912-47eb-a326-62647f5c9486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 02:43:43.698562 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:43.698500 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fc79abb-a912-47eb-a326-62647f5c9486-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4fc79abb-a912-47eb-a326-62647f5c9486" (UID: "4fc79abb-a912-47eb-a326-62647f5c9486"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 02:43:43.700235 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:43.700173 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4fc79abb-a912-47eb-a326-62647f5c9486" (UID: "4fc79abb-a912-47eb-a326-62647f5c9486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:43:43.700347 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:43.700189 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4fc79abb-a912-47eb-a326-62647f5c9486" (UID: "4fc79abb-a912-47eb-a326-62647f5c9486"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:43:43.700412 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:43.700380 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fc79abb-a912-47eb-a326-62647f5c9486-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4fc79abb-a912-47eb-a326-62647f5c9486" (UID: "4fc79abb-a912-47eb-a326-62647f5c9486"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 02:43:43.700464 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:43.700403 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-kube-api-access-wxnpq" (OuterVolumeSpecName: "kube-api-access-wxnpq") pod "4fc79abb-a912-47eb-a326-62647f5c9486" (UID: "4fc79abb-a912-47eb-a326-62647f5c9486"). InnerVolumeSpecName "kube-api-access-wxnpq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:43:43.700610 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:43.700587 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fc79abb-a912-47eb-a326-62647f5c9486-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "4fc79abb-a912-47eb-a326-62647f5c9486" (UID: "4fc79abb-a912-47eb-a326-62647f5c9486"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 02:43:43.706904 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:43.706875 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fc79abb-a912-47eb-a326-62647f5c9486-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4fc79abb-a912-47eb-a326-62647f5c9486" (UID: "4fc79abb-a912-47eb-a326-62647f5c9486"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:43:43.799188 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:43.799109 2572 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4fc79abb-a912-47eb-a326-62647f5c9486-ca-trust-extracted\") on node \"ip-10-0-137-147.ec2.internal\" DevicePath \"\"" Apr 21 02:43:43.799188 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:43.799140 2572 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-bound-sa-token\") on node \"ip-10-0-137-147.ec2.internal\" DevicePath \"\"" Apr 21 02:43:43.799188 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:43.799150 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fc79abb-a912-47eb-a326-62647f5c9486-trusted-ca\") on node \"ip-10-0-137-147.ec2.internal\" DevicePath \"\"" Apr 21 02:43:43.799188 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:43.799158 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wxnpq\" (UniqueName: \"kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-kube-api-access-wxnpq\") on node \"ip-10-0-137-147.ec2.internal\" DevicePath \"\"" Apr 21 02:43:43.799188 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:43.799168 2572 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4fc79abb-a912-47eb-a326-62647f5c9486-registry-certificates\") on node \"ip-10-0-137-147.ec2.internal\" DevicePath \"\"" Apr 21 02:43:43.799188 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:43.799178 2572 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4fc79abb-a912-47eb-a326-62647f5c9486-image-registry-private-configuration\") on node \"ip-10-0-137-147.ec2.internal\" DevicePath \"\"" Apr 21 02:43:43.799188 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:43.799188 2572 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4fc79abb-a912-47eb-a326-62647f5c9486-installation-pull-secrets\") on node \"ip-10-0-137-147.ec2.internal\" DevicePath \"\"" Apr 21 02:43:43.799188 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:43.799196 2572 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4fc79abb-a912-47eb-a326-62647f5c9486-registry-tls\") on node \"ip-10-0-137-147.ec2.internal\" DevicePath \"\"" Apr 21 02:43:44.467927 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:44.467885 2572 generic.go:358] "Generic (PLEG): container finished" podID="4fc79abb-a912-47eb-a326-62647f5c9486" containerID="6ad8458b94fe566eb7ae33ac35101065db05e2ca72a563ce80f06e4cf9235c31" exitCode=0 Apr 21 02:43:44.468136 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:44.467936 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" event={"ID":"4fc79abb-a912-47eb-a326-62647f5c9486","Type":"ContainerDied","Data":"6ad8458b94fe566eb7ae33ac35101065db05e2ca72a563ce80f06e4cf9235c31"} Apr 21 02:43:44.468136 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:44.467941 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" Apr 21 02:43:44.468136 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:44.467965 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55f7c6659-qqb6n" event={"ID":"4fc79abb-a912-47eb-a326-62647f5c9486","Type":"ContainerDied","Data":"2b615f40d3e9cd490c911078153c45406c9b62fc2d5b42205cf828b4c3118b75"} Apr 21 02:43:44.468136 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:44.467981 2572 scope.go:117] "RemoveContainer" containerID="6ad8458b94fe566eb7ae33ac35101065db05e2ca72a563ce80f06e4cf9235c31" Apr 21 02:43:44.476256 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:44.476237 2572 scope.go:117] "RemoveContainer" containerID="6ad8458b94fe566eb7ae33ac35101065db05e2ca72a563ce80f06e4cf9235c31" Apr 21 02:43:44.476498 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:43:44.476480 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ad8458b94fe566eb7ae33ac35101065db05e2ca72a563ce80f06e4cf9235c31\": container with ID starting with 6ad8458b94fe566eb7ae33ac35101065db05e2ca72a563ce80f06e4cf9235c31 not found: ID does not exist" containerID="6ad8458b94fe566eb7ae33ac35101065db05e2ca72a563ce80f06e4cf9235c31" Apr 21 02:43:44.476563 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:44.476507 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ad8458b94fe566eb7ae33ac35101065db05e2ca72a563ce80f06e4cf9235c31"} err="failed to get container status \"6ad8458b94fe566eb7ae33ac35101065db05e2ca72a563ce80f06e4cf9235c31\": rpc error: code = NotFound desc = could not find container \"6ad8458b94fe566eb7ae33ac35101065db05e2ca72a563ce80f06e4cf9235c31\": container with ID starting with 6ad8458b94fe566eb7ae33ac35101065db05e2ca72a563ce80f06e4cf9235c31 not found: ID does not exist" Apr 21 02:43:44.485876 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:44.485848 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-55f7c6659-qqb6n"] Apr 21 02:43:44.491554 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:44.491516 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-55f7c6659-qqb6n"] Apr 21 02:43:44.907986 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:44.907902 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fc79abb-a912-47eb-a326-62647f5c9486" path="/var/lib/kubelet/pods/4fc79abb-a912-47eb-a326-62647f5c9486/volumes" Apr 21 02:43:56.498080 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:43:56.498040 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" podUID="9aa02d10-4b83-4818-ab56-8c51e74f5f5d" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 02:44:06.498074 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:44:06.498036 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" podUID="9aa02d10-4b83-4818-ab56-8c51e74f5f5d" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 02:44:16.498242 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:44:16.498197 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" 
podUID="9aa02d10-4b83-4818-ab56-8c51e74f5f5d" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 02:44:16.498748 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:44:16.498281 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" Apr 21 02:44:16.498807 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:44:16.498746 2572 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"b6457f1bae6ba9586b7ca6a3c072c24b7166eb5464b8582599a407a7d0ebbf02"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 21 02:44:16.498807 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:44:16.498780 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" podUID="9aa02d10-4b83-4818-ab56-8c51e74f5f5d" containerName="service-proxy" containerID="cri-o://b6457f1bae6ba9586b7ca6a3c072c24b7166eb5464b8582599a407a7d0ebbf02" gracePeriod=30 Apr 21 02:44:17.552993 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:44:17.552963 2572 generic.go:358] "Generic (PLEG): container finished" podID="9aa02d10-4b83-4818-ab56-8c51e74f5f5d" containerID="b6457f1bae6ba9586b7ca6a3c072c24b7166eb5464b8582599a407a7d0ebbf02" exitCode=2 Apr 21 02:44:17.553372 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:44:17.553025 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" event={"ID":"9aa02d10-4b83-4818-ab56-8c51e74f5f5d","Type":"ContainerDied","Data":"b6457f1bae6ba9586b7ca6a3c072c24b7166eb5464b8582599a407a7d0ebbf02"} Apr 21 02:44:17.553372 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:44:17.553062 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5454cdfc-wgbjk" event={"ID":"9aa02d10-4b83-4818-ab56-8c51e74f5f5d","Type":"ContainerStarted","Data":"8ff48e1b9090e3ac6267a61c0f6bc307935e328732ee1f3eccc202789e1aafdd"} Apr 21 02:44:44.850082 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:44:44.850045 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs\") pod \"network-metrics-daemon-bzdk8\" (UID: \"c2a4d15a-56b4-43a2-b85f-305025a28b5e\") " pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:44:44.852325 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:44:44.852294 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2a4d15a-56b4-43a2-b85f-305025a28b5e-metrics-certs\") pod \"network-metrics-daemon-bzdk8\" (UID: \"c2a4d15a-56b4-43a2-b85f-305025a28b5e\") " pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:44:45.009199 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:44:45.009166 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-j2fpz\"" Apr 21 02:44:45.016284 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:44:45.016263 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzdk8" Apr 21 02:44:45.140023 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:44:45.139941 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bzdk8"] Apr 21 02:44:45.144351 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:44:45.144305 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2a4d15a_56b4_43a2_b85f_305025a28b5e.slice/crio-1e72a1d95f660fb2dc5554e793d989d8e82472c1391ba93b8d4e66ff27c1aec1 WatchSource:0}: Error finding container 1e72a1d95f660fb2dc5554e793d989d8e82472c1391ba93b8d4e66ff27c1aec1: Status 404 returned error can't find the container with id 1e72a1d95f660fb2dc5554e793d989d8e82472c1391ba93b8d4e66ff27c1aec1 Apr 21 02:44:45.626301 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:44:45.626265 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bzdk8" event={"ID":"c2a4d15a-56b4-43a2-b85f-305025a28b5e","Type":"ContainerStarted","Data":"1e72a1d95f660fb2dc5554e793d989d8e82472c1391ba93b8d4e66ff27c1aec1"} Apr 21 02:44:46.629835 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:44:46.629789 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bzdk8" event={"ID":"c2a4d15a-56b4-43a2-b85f-305025a28b5e","Type":"ContainerStarted","Data":"a64400bab6dac53c593e3752a82ef8ab197f266dc28c2f01641d05d0115ff6ee"} Apr 21 02:44:46.629835 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:44:46.629827 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bzdk8" event={"ID":"c2a4d15a-56b4-43a2-b85f-305025a28b5e","Type":"ContainerStarted","Data":"0ec45bbd69a7a3b5cfb8d669422fc5d8a4f250bc18eaecf869f88e56f08a625b"} Apr 21 02:44:46.650968 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:44:46.650916 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-bzdk8" podStartSLOduration=252.780062379 podStartE2EDuration="4m13.650900241s" podCreationTimestamp="2026-04-21 02:40:33 +0000 UTC" firstStartedPulling="2026-04-21 02:44:45.146171226 +0000 UTC m=+252.830715468" lastFinishedPulling="2026-04-21 02:44:46.01700909 +0000 UTC m=+253.701553330" observedRunningTime="2026-04-21 02:44:46.649164918 +0000 UTC m=+254.333709177" watchObservedRunningTime="2026-04-21 02:44:46.650900241 +0000 UTC m=+254.335444501" Apr 21 02:45:32.804991 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:45:32.804943 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 02:47:02.696176 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:02.696098 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-rpbwv"] Apr 21 02:47:02.696652 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:02.696332 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4fc79abb-a912-47eb-a326-62647f5c9486" containerName="registry" Apr 21 02:47:02.696652 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:02.696344 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fc79abb-a912-47eb-a326-62647f5c9486" containerName="registry" Apr 21 02:47:02.696652 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:02.696379 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4fc79abb-a912-47eb-a326-62647f5c9486" containerName="registry" Apr 21 02:47:02.699016 ip-10-0-137-147 kubenswrapper[2572]: I0421 
02:47:02.699000 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-rpbwv" Apr 21 02:47:02.701377 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:02.701351 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 21 02:47:02.702085 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:02.702069 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-gx8c8\"" Apr 21 02:47:02.702158 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:02.702069 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 21 02:47:02.711492 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:02.711465 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-rpbwv"] Apr 21 02:47:02.803436 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:02.803392 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e61ffe11-0244-4ccb-a7bb-3776ecfc249b-bound-sa-token\") pod \"cert-manager-79c8d999ff-rpbwv\" (UID: \"e61ffe11-0244-4ccb-a7bb-3776ecfc249b\") " pod="cert-manager/cert-manager-79c8d999ff-rpbwv" Apr 21 02:47:02.803436 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:02.803441 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbsls\" (UniqueName: \"kubernetes.io/projected/e61ffe11-0244-4ccb-a7bb-3776ecfc249b-kube-api-access-lbsls\") pod \"cert-manager-79c8d999ff-rpbwv\" (UID: \"e61ffe11-0244-4ccb-a7bb-3776ecfc249b\") " pod="cert-manager/cert-manager-79c8d999ff-rpbwv" Apr 21 02:47:02.904704 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:02.904672 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbsls\" (UniqueName: \"kubernetes.io/projected/e61ffe11-0244-4ccb-a7bb-3776ecfc249b-kube-api-access-lbsls\") pod \"cert-manager-79c8d999ff-rpbwv\" (UID: \"e61ffe11-0244-4ccb-a7bb-3776ecfc249b\") " pod="cert-manager/cert-manager-79c8d999ff-rpbwv" Apr 21 02:47:02.904812 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:02.904776 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e61ffe11-0244-4ccb-a7bb-3776ecfc249b-bound-sa-token\") pod \"cert-manager-79c8d999ff-rpbwv\" (UID: \"e61ffe11-0244-4ccb-a7bb-3776ecfc249b\") " pod="cert-manager/cert-manager-79c8d999ff-rpbwv" Apr 21 02:47:02.912287 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:02.912258 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e61ffe11-0244-4ccb-a7bb-3776ecfc249b-bound-sa-token\") pod \"cert-manager-79c8d999ff-rpbwv\" (UID: \"e61ffe11-0244-4ccb-a7bb-3776ecfc249b\") " pod="cert-manager/cert-manager-79c8d999ff-rpbwv" Apr 21 02:47:02.912420 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:02.912344 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbsls\" (UniqueName: \"kubernetes.io/projected/e61ffe11-0244-4ccb-a7bb-3776ecfc249b-kube-api-access-lbsls\") pod \"cert-manager-79c8d999ff-rpbwv\" (UID: \"e61ffe11-0244-4ccb-a7bb-3776ecfc249b\") " pod="cert-manager/cert-manager-79c8d999ff-rpbwv" Apr 21 02:47:03.007998 ip-10-0-137-147 kubenswrapper[2572]: 
I0421 02:47:03.007924 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-rpbwv" Apr 21 02:47:03.125594 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:03.125558 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-rpbwv"] Apr 21 02:47:03.128747 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:47:03.128710 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode61ffe11_0244_4ccb_a7bb_3776ecfc249b.slice/crio-0015adc1fdbeda13251c7215c707a4d94448e50a4b19d960cec558e492579fe6 WatchSource:0}: Error finding container 0015adc1fdbeda13251c7215c707a4d94448e50a4b19d960cec558e492579fe6: Status 404 returned error can't find the container with id 0015adc1fdbeda13251c7215c707a4d94448e50a4b19d960cec558e492579fe6 Apr 21 02:47:03.130638 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:03.130620 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 02:47:03.982241 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:03.982197 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-rpbwv" event={"ID":"e61ffe11-0244-4ccb-a7bb-3776ecfc249b","Type":"ContainerStarted","Data":"0015adc1fdbeda13251c7215c707a4d94448e50a4b19d960cec558e492579fe6"} Apr 21 02:47:06.995730 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:06.995682 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-rpbwv" event={"ID":"e61ffe11-0244-4ccb-a7bb-3776ecfc249b","Type":"ContainerStarted","Data":"101d3601d72b70b3c25411d74eb82aff4e32748500abe07168599e8cc72ddcd3"} Apr 21 02:47:07.011561 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:07.011487 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-rpbwv" podStartSLOduration=2.214147729 podStartE2EDuration="5.011468332s" podCreationTimestamp="2026-04-21 02:47:02 +0000 UTC" firstStartedPulling="2026-04-21 02:47:03.130780512 +0000 UTC m=+390.815324754" lastFinishedPulling="2026-04-21 02:47:05.928101104 +0000 UTC m=+393.612645357" observedRunningTime="2026-04-21 02:47:07.009719235 +0000 UTC m=+394.694263494" watchObservedRunningTime="2026-04-21 02:47:07.011468332 +0000 UTC m=+394.696012591" Apr 21 02:47:19.912123 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:19.912092 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5f4d6bff-tcbdh"] Apr 21 02:47:19.915214 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:19.915196 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-tcbdh" Apr 21 02:47:19.917374 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:19.917351 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 21 02:47:19.917827 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:19.917796 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 21 02:47:19.918012 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:19.917994 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-pmqhb\"" Apr 21 02:47:19.918380 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:19.918366 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 21 02:47:19.919397 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:19.919381 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 21 02:47:19.929361 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:19.929340 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5f4d6bff-tcbdh"] Apr 21 02:47:20.037376 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:20.037344 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lns4k\" (UniqueName: \"kubernetes.io/projected/63f53f70-1b73-4784-89ca-6b4c169a1521-kube-api-access-lns4k\") pod \"opendatahub-operator-controller-manager-5f4d6bff-tcbdh\" (UID: \"63f53f70-1b73-4784-89ca-6b4c169a1521\") " pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-tcbdh" Apr 21 02:47:20.037376 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:20.037379 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/63f53f70-1b73-4784-89ca-6b4c169a1521-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5f4d6bff-tcbdh\" (UID: \"63f53f70-1b73-4784-89ca-6b4c169a1521\") " pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-tcbdh" Apr 21 02:47:20.037609 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:20.037409 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/63f53f70-1b73-4784-89ca-6b4c169a1521-webhook-cert\") pod \"opendatahub-operator-controller-manager-5f4d6bff-tcbdh\" (UID: \"63f53f70-1b73-4784-89ca-6b4c169a1521\") " pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-tcbdh" Apr 21 02:47:20.138448 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:20.138416 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lns4k\" (UniqueName: \"kubernetes.io/projected/63f53f70-1b73-4784-89ca-6b4c169a1521-kube-api-access-lns4k\") pod \"opendatahub-operator-controller-manager-5f4d6bff-tcbdh\" (UID: \"63f53f70-1b73-4784-89ca-6b4c169a1521\") " pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-tcbdh" Apr 21 02:47:20.138544 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:20.138454 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/63f53f70-1b73-4784-89ca-6b4c169a1521-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5f4d6bff-tcbdh\" (UID: \"63f53f70-1b73-4784-89ca-6b4c169a1521\") " pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-tcbdh" Apr 21 02:47:20.138544 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:20.138476 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/63f53f70-1b73-4784-89ca-6b4c169a1521-webhook-cert\") pod \"opendatahub-operator-controller-manager-5f4d6bff-tcbdh\" (UID: \"63f53f70-1b73-4784-89ca-6b4c169a1521\") " pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-tcbdh" Apr 21 02:47:20.140909 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:20.140881 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/63f53f70-1b73-4784-89ca-6b4c169a1521-webhook-cert\") pod \"opendatahub-operator-controller-manager-5f4d6bff-tcbdh\" (UID: \"63f53f70-1b73-4784-89ca-6b4c169a1521\") " pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-tcbdh" Apr 21 02:47:20.141008 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:20.140933 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/63f53f70-1b73-4784-89ca-6b4c169a1521-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5f4d6bff-tcbdh\" (UID: \"63f53f70-1b73-4784-89ca-6b4c169a1521\") " pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-tcbdh" Apr 21 02:47:20.146353 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:20.146328 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lns4k\" (UniqueName: \"kubernetes.io/projected/63f53f70-1b73-4784-89ca-6b4c169a1521-kube-api-access-lns4k\") pod \"opendatahub-operator-controller-manager-5f4d6bff-tcbdh\" (UID: \"63f53f70-1b73-4784-89ca-6b4c169a1521\") " pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-tcbdh" Apr 21 02:47:20.224854 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:20.224816 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-tcbdh" Apr 21 02:47:20.343342 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:20.343311 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5f4d6bff-tcbdh"] Apr 21 02:47:20.346836 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:47:20.346807 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63f53f70_1b73_4784_89ca_6b4c169a1521.slice/crio-4ec1d542e9c0c04bc01fbcc959ea3b4d20bd3ea59bb1b743115a6c6f8e935eae WatchSource:0}: Error finding container 4ec1d542e9c0c04bc01fbcc959ea3b4d20bd3ea59bb1b743115a6c6f8e935eae: Status 404 returned error can't find the container with id 4ec1d542e9c0c04bc01fbcc959ea3b4d20bd3ea59bb1b743115a6c6f8e935eae Apr 21 02:47:21.034706 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:21.034665 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-tcbdh" event={"ID":"63f53f70-1b73-4784-89ca-6b4c169a1521","Type":"ContainerStarted","Data":"4ec1d542e9c0c04bc01fbcc959ea3b4d20bd3ea59bb1b743115a6c6f8e935eae"} Apr 21 02:47:23.043014 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:23.042969 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-tcbdh" event={"ID":"63f53f70-1b73-4784-89ca-6b4c169a1521","Type":"ContainerStarted","Data":"3f4ce5590025729bb712bc5812bc943b04bab700ced5f8a0d3acad7afed8bef4"} Apr 21 02:47:23.043428 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:23.043117 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-tcbdh" Apr 21 02:47:23.062809 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:23.062704 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-tcbdh" podStartSLOduration=1.60785299 podStartE2EDuration="4.062689226s" podCreationTimestamp="2026-04-21 02:47:19 +0000 UTC" firstStartedPulling="2026-04-21 02:47:20.348507885 +0000 UTC m=+408.033052128" lastFinishedPulling="2026-04-21 02:47:22.803344123 +0000 UTC m=+410.487888364" observedRunningTime="2026-04-21 02:47:23.061793305 +0000 UTC m=+410.746337566" watchObservedRunningTime="2026-04-21 02:47:23.062689226 +0000 UTC m=+410.747233485" Apr 21 02:47:34.048121 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:34.048091 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-5f4d6bff-tcbdh" Apr 21 02:47:50.465033 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:50.464994 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-55fc66fcf7-wdh67"] Apr 21 02:47:50.472141 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:50.472110 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-55fc66fcf7-wdh67" Apr 21 02:47:50.475119 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:50.475078 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 21 02:47:50.475297 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:50.475121 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 21 02:47:50.475297 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:50.475083 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-2526x\"" Apr 21 02:47:50.475297 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:50.475082 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 21 02:47:50.475297 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:50.475234 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 21 02:47:50.477698 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:50.477662 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-55fc66fcf7-wdh67"] Apr 21 02:47:50.559273 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:50.559237 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/838a4b2c-71ad-43cc-ad01-797dad3a3ecf-tls-certs\") pod \"kube-auth-proxy-55fc66fcf7-wdh67\" (UID: \"838a4b2c-71ad-43cc-ad01-797dad3a3ecf\") " pod="openshift-ingress/kube-auth-proxy-55fc66fcf7-wdh67" Apr 21 02:47:50.559442 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:50.559277 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptstm\" (UniqueName: \"kubernetes.io/projected/838a4b2c-71ad-43cc-ad01-797dad3a3ecf-kube-api-access-ptstm\") pod \"kube-auth-proxy-55fc66fcf7-wdh67\" (UID: \"838a4b2c-71ad-43cc-ad01-797dad3a3ecf\") " pod="openshift-ingress/kube-auth-proxy-55fc66fcf7-wdh67" Apr 21 02:47:50.559442 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:50.559400 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/838a4b2c-71ad-43cc-ad01-797dad3a3ecf-tmp\") pod \"kube-auth-proxy-55fc66fcf7-wdh67\" (UID: \"838a4b2c-71ad-43cc-ad01-797dad3a3ecf\") " pod="openshift-ingress/kube-auth-proxy-55fc66fcf7-wdh67" Apr 21 02:47:50.660621 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:50.660585 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/838a4b2c-71ad-43cc-ad01-797dad3a3ecf-tls-certs\") pod \"kube-auth-proxy-55fc66fcf7-wdh67\" (UID: \"838a4b2c-71ad-43cc-ad01-797dad3a3ecf\") " pod="openshift-ingress/kube-auth-proxy-55fc66fcf7-wdh67" Apr 21 02:47:50.660804 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:50.660628 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ptstm\" (UniqueName: \"kubernetes.io/projected/838a4b2c-71ad-43cc-ad01-797dad3a3ecf-kube-api-access-ptstm\") pod \"kube-auth-proxy-55fc66fcf7-wdh67\" (UID: \"838a4b2c-71ad-43cc-ad01-797dad3a3ecf\") " pod="openshift-ingress/kube-auth-proxy-55fc66fcf7-wdh67" Apr 21 02:47:50.660804 ip-10-0-137-147 kubenswrapper[2572]: 
I0421 02:47:50.660685 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/838a4b2c-71ad-43cc-ad01-797dad3a3ecf-tmp\") pod \"kube-auth-proxy-55fc66fcf7-wdh67\" (UID: \"838a4b2c-71ad-43cc-ad01-797dad3a3ecf\") " pod="openshift-ingress/kube-auth-proxy-55fc66fcf7-wdh67" Apr 21 02:47:50.663046 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:50.663022 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/838a4b2c-71ad-43cc-ad01-797dad3a3ecf-tmp\") pod \"kube-auth-proxy-55fc66fcf7-wdh67\" (UID: \"838a4b2c-71ad-43cc-ad01-797dad3a3ecf\") " pod="openshift-ingress/kube-auth-proxy-55fc66fcf7-wdh67" Apr 21 02:47:50.663280 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:50.663261 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/838a4b2c-71ad-43cc-ad01-797dad3a3ecf-tls-certs\") pod \"kube-auth-proxy-55fc66fcf7-wdh67\" (UID: \"838a4b2c-71ad-43cc-ad01-797dad3a3ecf\") " pod="openshift-ingress/kube-auth-proxy-55fc66fcf7-wdh67" Apr 21 02:47:50.669095 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:50.669070 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptstm\" (UniqueName: \"kubernetes.io/projected/838a4b2c-71ad-43cc-ad01-797dad3a3ecf-kube-api-access-ptstm\") pod \"kube-auth-proxy-55fc66fcf7-wdh67\" (UID: \"838a4b2c-71ad-43cc-ad01-797dad3a3ecf\") " pod="openshift-ingress/kube-auth-proxy-55fc66fcf7-wdh67" Apr 21 02:47:50.782552 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:50.782433 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-55fc66fcf7-wdh67" Apr 21 02:47:50.912891 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:50.912864 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-55fc66fcf7-wdh67"] Apr 21 02:47:51.115029 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:51.114938 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-55fc66fcf7-wdh67" event={"ID":"838a4b2c-71ad-43cc-ad01-797dad3a3ecf","Type":"ContainerStarted","Data":"43603cf935c86ccfe233521a314f15a1051e89a3ecb3b34769c93b596a285075"} Apr 21 02:47:51.187897 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:51.187866 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-gmv48"] Apr 21 02:47:51.192309 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:51.192289 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-gmv48" Apr 21 02:47:51.194510 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:51.194478 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 21 02:47:51.194643 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:51.194509 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-lqlfl\"" Apr 21 02:47:51.197597 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:51.197572 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-gmv48"] Apr 21 02:47:51.265629 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:51.265591 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/845205e9-4ed6-439b-9e1e-f2f66fecc8f3-cert\") pod \"odh-model-controller-858dbf95b8-gmv48\" (UID: \"845205e9-4ed6-439b-9e1e-f2f66fecc8f3\") " pod="opendatahub/odh-model-controller-858dbf95b8-gmv48" Apr 21 02:47:51.265817 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:51.265658 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6schn\" (UniqueName: \"kubernetes.io/projected/845205e9-4ed6-439b-9e1e-f2f66fecc8f3-kube-api-access-6schn\") pod \"odh-model-controller-858dbf95b8-gmv48\" (UID: \"845205e9-4ed6-439b-9e1e-f2f66fecc8f3\") " pod="opendatahub/odh-model-controller-858dbf95b8-gmv48" Apr 21 02:47:51.366960 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:51.366885 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6schn\" (UniqueName: \"kubernetes.io/projected/845205e9-4ed6-439b-9e1e-f2f66fecc8f3-kube-api-access-6schn\") pod \"odh-model-controller-858dbf95b8-gmv48\" (UID: \"845205e9-4ed6-439b-9e1e-f2f66fecc8f3\") " pod="opendatahub/odh-model-controller-858dbf95b8-gmv48" Apr 21 02:47:51.366960 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:51.366937 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/845205e9-4ed6-439b-9e1e-f2f66fecc8f3-cert\") pod \"odh-model-controller-858dbf95b8-gmv48\" (UID: \"845205e9-4ed6-439b-9e1e-f2f66fecc8f3\") " pod="opendatahub/odh-model-controller-858dbf95b8-gmv48" Apr 21 02:47:51.367142 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:47:51.367041 2572 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 21 02:47:51.367142 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:47:51.367116 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/845205e9-4ed6-439b-9e1e-f2f66fecc8f3-cert podName:845205e9-4ed6-439b-9e1e-f2f66fecc8f3 nodeName:}" failed. No retries permitted until 2026-04-21 02:47:51.867096466 +0000 UTC m=+439.551640703 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/845205e9-4ed6-439b-9e1e-f2f66fecc8f3-cert") pod "odh-model-controller-858dbf95b8-gmv48" (UID: "845205e9-4ed6-439b-9e1e-f2f66fecc8f3") : secret "odh-model-controller-webhook-cert" not found Apr 21 02:47:51.375511 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:51.375478 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6schn\" (UniqueName: \"kubernetes.io/projected/845205e9-4ed6-439b-9e1e-f2f66fecc8f3-kube-api-access-6schn\") pod \"odh-model-controller-858dbf95b8-gmv48\" (UID: \"845205e9-4ed6-439b-9e1e-f2f66fecc8f3\") " pod="opendatahub/odh-model-controller-858dbf95b8-gmv48" Apr 21 02:47:51.872431 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:51.872390 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/845205e9-4ed6-439b-9e1e-f2f66fecc8f3-cert\") pod \"odh-model-controller-858dbf95b8-gmv48\" (UID: \"845205e9-4ed6-439b-9e1e-f2f66fecc8f3\") " pod="opendatahub/odh-model-controller-858dbf95b8-gmv48" Apr 21 02:47:51.872895 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:47:51.872586 2572 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 21 02:47:51.872895 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:47:51.872660 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/845205e9-4ed6-439b-9e1e-f2f66fecc8f3-cert podName:845205e9-4ed6-439b-9e1e-f2f66fecc8f3 nodeName:}" failed. No retries permitted until 2026-04-21 02:47:52.872639609 +0000 UTC m=+440.557183861 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/845205e9-4ed6-439b-9e1e-f2f66fecc8f3-cert") pod "odh-model-controller-858dbf95b8-gmv48" (UID: "845205e9-4ed6-439b-9e1e-f2f66fecc8f3") : secret "odh-model-controller-webhook-cert" not found Apr 21 02:47:52.881937 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:52.881897 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/845205e9-4ed6-439b-9e1e-f2f66fecc8f3-cert\") pod \"odh-model-controller-858dbf95b8-gmv48\" (UID: \"845205e9-4ed6-439b-9e1e-f2f66fecc8f3\") " pod="opendatahub/odh-model-controller-858dbf95b8-gmv48" Apr 21 02:47:52.884970 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:52.884942 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/845205e9-4ed6-439b-9e1e-f2f66fecc8f3-cert\") pod \"odh-model-controller-858dbf95b8-gmv48\" (UID: \"845205e9-4ed6-439b-9e1e-f2f66fecc8f3\") " pod="opendatahub/odh-model-controller-858dbf95b8-gmv48" Apr 21 02:47:53.004086 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:53.004046 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-gmv48" Apr 21 02:47:53.138556 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:53.138421 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-gmv48"] Apr 21 02:47:53.144280 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:47:53.144245 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod845205e9_4ed6_439b_9e1e_f2f66fecc8f3.slice/crio-52ae8da032ef6091051322a21aa57b669c040a2af8ffa931651716a48a883f8f WatchSource:0}: Error finding container 52ae8da032ef6091051322a21aa57b669c040a2af8ffa931651716a48a883f8f: Status 404 returned error can't find the container with id 52ae8da032ef6091051322a21aa57b669c040a2af8ffa931651716a48a883f8f Apr 21 02:47:54.124707 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:54.124666 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-55fc66fcf7-wdh67" event={"ID":"838a4b2c-71ad-43cc-ad01-797dad3a3ecf","Type":"ContainerStarted","Data":"cfe5b59f4a0e18dbb7859e6cf8a32e36b5ecb181f533008efc61829d7ae497a5"} Apr 21 02:47:54.125786 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:54.125760 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-gmv48" event={"ID":"845205e9-4ed6-439b-9e1e-f2f66fecc8f3","Type":"ContainerStarted","Data":"52ae8da032ef6091051322a21aa57b669c040a2af8ffa931651716a48a883f8f"} Apr 21 02:47:54.140406 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:54.140352 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-55fc66fcf7-wdh67" podStartSLOduration=1.033708261 podStartE2EDuration="4.140335423s" podCreationTimestamp="2026-04-21 02:47:50 +0000 UTC" firstStartedPulling="2026-04-21 02:47:50.9114517 +0000 UTC m=+438.595995953" lastFinishedPulling="2026-04-21 02:47:54.018078861 +0000 UTC m=+441.702623115" observedRunningTime="2026-04-21 02:47:54.139416472 +0000 UTC m=+441.823960733" watchObservedRunningTime="2026-04-21 02:47:54.140335423 +0000 UTC m=+441.824879683" Apr 21 02:47:57.137871 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:57.137830 2572 generic.go:358] "Generic (PLEG): container finished" podID="845205e9-4ed6-439b-9e1e-f2f66fecc8f3" containerID="3649ab49df5c013f6893aae988f8c9ef3b6c09fa0f0e9c2409f943089987da26" exitCode=1 Apr 21 02:47:57.138353 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:57.137922 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-gmv48" event={"ID":"845205e9-4ed6-439b-9e1e-f2f66fecc8f3","Type":"ContainerDied","Data":"3649ab49df5c013f6893aae988f8c9ef3b6c09fa0f0e9c2409f943089987da26"} Apr 21 02:47:57.138353 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:57.138155 2572 scope.go:117] "RemoveContainer" containerID="3649ab49df5c013f6893aae988f8c9ef3b6c09fa0f0e9c2409f943089987da26" Apr 21 02:47:58.143119 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:58.143084 2572 generic.go:358] "Generic (PLEG): container finished" podID="845205e9-4ed6-439b-9e1e-f2f66fecc8f3" containerID="85da211d8a95592bf29ccfd57baaec521171d074968783ff86aeab63ce855985" exitCode=1 Apr 21 02:47:58.143620 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:58.143176 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-gmv48" 
event={"ID":"845205e9-4ed6-439b-9e1e-f2f66fecc8f3","Type":"ContainerDied","Data":"85da211d8a95592bf29ccfd57baaec521171d074968783ff86aeab63ce855985"} Apr 21 02:47:58.143620 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:58.143203 2572 scope.go:117] "RemoveContainer" containerID="3649ab49df5c013f6893aae988f8c9ef3b6c09fa0f0e9c2409f943089987da26" Apr 21 02:47:58.143620 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:58.143556 2572 scope.go:117] "RemoveContainer" containerID="85da211d8a95592bf29ccfd57baaec521171d074968783ff86aeab63ce855985" Apr 21 02:47:58.143820 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:47:58.143798 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-gmv48_opendatahub(845205e9-4ed6-439b-9e1e-f2f66fecc8f3)\"" pod="opendatahub/odh-model-controller-858dbf95b8-gmv48" podUID="845205e9-4ed6-439b-9e1e-f2f66fecc8f3" Apr 21 02:47:58.447847 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:58.447812 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-66sb4"] Apr 21 02:47:58.452646 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:58.452621 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-66sb4" Apr 21 02:47:58.455817 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:58.455786 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 21 02:47:58.455999 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:58.455977 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-xslcb\"" Apr 21 02:47:58.456091 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:58.456076 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 21 02:47:58.472366 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:58.472337 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-66sb4"] Apr 21 02:47:58.627816 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:58.627772 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nkmc\" (UniqueName: \"kubernetes.io/projected/88add22f-56d1-4b2f-8a4a-5654c5215f6b-kube-api-access-7nkmc\") pod \"servicemesh-operator3-55f49c5f94-66sb4\" (UID: \"88add22f-56d1-4b2f-8a4a-5654c5215f6b\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-66sb4" Apr 21 02:47:58.627970 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:58.627882 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/88add22f-56d1-4b2f-8a4a-5654c5215f6b-operator-config\") pod \"servicemesh-operator3-55f49c5f94-66sb4\" (UID: \"88add22f-56d1-4b2f-8a4a-5654c5215f6b\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-66sb4" Apr 21 02:47:58.728835 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:58.728732 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/88add22f-56d1-4b2f-8a4a-5654c5215f6b-operator-config\") pod \"servicemesh-operator3-55f49c5f94-66sb4\" (UID: 
\"88add22f-56d1-4b2f-8a4a-5654c5215f6b\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-66sb4" Apr 21 02:47:58.728835 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:58.728800 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7nkmc\" (UniqueName: \"kubernetes.io/projected/88add22f-56d1-4b2f-8a4a-5654c5215f6b-kube-api-access-7nkmc\") pod \"servicemesh-operator3-55f49c5f94-66sb4\" (UID: \"88add22f-56d1-4b2f-8a4a-5654c5215f6b\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-66sb4" Apr 21 02:47:58.731405 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:58.731366 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/88add22f-56d1-4b2f-8a4a-5654c5215f6b-operator-config\") pod \"servicemesh-operator3-55f49c5f94-66sb4\" (UID: \"88add22f-56d1-4b2f-8a4a-5654c5215f6b\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-66sb4" Apr 21 02:47:58.744163 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:58.744124 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nkmc\" (UniqueName: \"kubernetes.io/projected/88add22f-56d1-4b2f-8a4a-5654c5215f6b-kube-api-access-7nkmc\") pod \"servicemesh-operator3-55f49c5f94-66sb4\" (UID: \"88add22f-56d1-4b2f-8a4a-5654c5215f6b\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-66sb4" Apr 21 02:47:58.761975 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:58.761938 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-66sb4" Apr 21 02:47:58.893955 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:58.893875 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-66sb4"] Apr 21 02:47:58.896764 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:47:58.896732 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88add22f_56d1_4b2f_8a4a_5654c5215f6b.slice/crio-0908a6134b5131af3037cd7b9dcd22b3cf30da537f9713b86d2f996ff9dfa606 WatchSource:0}: Error finding container 0908a6134b5131af3037cd7b9dcd22b3cf30da537f9713b86d2f996ff9dfa606: Status 404 returned error can't find the container with id 0908a6134b5131af3037cd7b9dcd22b3cf30da537f9713b86d2f996ff9dfa606 Apr 21 02:47:59.148011 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:59.147928 2572 scope.go:117] "RemoveContainer" containerID="85da211d8a95592bf29ccfd57baaec521171d074968783ff86aeab63ce855985" Apr 21 02:47:59.148450 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:47:59.148155 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-gmv48_opendatahub(845205e9-4ed6-439b-9e1e-f2f66fecc8f3)\"" pod="opendatahub/odh-model-controller-858dbf95b8-gmv48" podUID="845205e9-4ed6-439b-9e1e-f2f66fecc8f3" Apr 21 02:47:59.148838 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:59.148814 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-66sb4" event={"ID":"88add22f-56d1-4b2f-8a4a-5654c5215f6b","Type":"ContainerStarted","Data":"0908a6134b5131af3037cd7b9dcd22b3cf30da537f9713b86d2f996ff9dfa606"} Apr 21 02:47:59.469278 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:59.469240 2572 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-2cqdx"] Apr 21 02:47:59.473916 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:59.473888 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-2cqdx" Apr 21 02:47:59.476329 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:59.476302 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 21 02:47:59.476664 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:59.476645 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-dcl85\"" Apr 21 02:47:59.483785 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:59.483386 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-2cqdx"] Apr 21 02:47:59.635445 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:59.635406 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6151b4cb-ae22-4b41-ab45-86e3ae63e7f4-cert\") pod \"kserve-controller-manager-856948b99f-2cqdx\" (UID: \"6151b4cb-ae22-4b41-ab45-86e3ae63e7f4\") " pod="opendatahub/kserve-controller-manager-856948b99f-2cqdx" Apr 21 02:47:59.635648 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:59.635471 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xqhg\" (UniqueName: \"kubernetes.io/projected/6151b4cb-ae22-4b41-ab45-86e3ae63e7f4-kube-api-access-9xqhg\") pod \"kserve-controller-manager-856948b99f-2cqdx\" (UID: \"6151b4cb-ae22-4b41-ab45-86e3ae63e7f4\") " pod="opendatahub/kserve-controller-manager-856948b99f-2cqdx" Apr 21 02:47:59.736474 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:59.736376 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6151b4cb-ae22-4b41-ab45-86e3ae63e7f4-cert\") pod \"kserve-controller-manager-856948b99f-2cqdx\" (UID: \"6151b4cb-ae22-4b41-ab45-86e3ae63e7f4\") " pod="opendatahub/kserve-controller-manager-856948b99f-2cqdx" Apr 21 02:47:59.736474 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:59.736435 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9xqhg\" (UniqueName: \"kubernetes.io/projected/6151b4cb-ae22-4b41-ab45-86e3ae63e7f4-kube-api-access-9xqhg\") pod \"kserve-controller-manager-856948b99f-2cqdx\" (UID: \"6151b4cb-ae22-4b41-ab45-86e3ae63e7f4\") " pod="opendatahub/kserve-controller-manager-856948b99f-2cqdx" Apr 21 02:47:59.736709 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:47:59.736553 2572 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 21 02:47:59.736709 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:47:59.736616 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6151b4cb-ae22-4b41-ab45-86e3ae63e7f4-cert podName:6151b4cb-ae22-4b41-ab45-86e3ae63e7f4 nodeName:}" failed. No retries permitted until 2026-04-21 02:48:00.236599065 +0000 UTC m=+447.921143310 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6151b4cb-ae22-4b41-ab45-86e3ae63e7f4-cert") pod "kserve-controller-manager-856948b99f-2cqdx" (UID: "6151b4cb-ae22-4b41-ab45-86e3ae63e7f4") : secret "kserve-webhook-server-cert" not found Apr 21 02:47:59.745429 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:47:59.745394 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xqhg\" (UniqueName: \"kubernetes.io/projected/6151b4cb-ae22-4b41-ab45-86e3ae63e7f4-kube-api-access-9xqhg\") pod \"kserve-controller-manager-856948b99f-2cqdx\" (UID: \"6151b4cb-ae22-4b41-ab45-86e3ae63e7f4\") " pod="opendatahub/kserve-controller-manager-856948b99f-2cqdx" Apr 21 02:48:00.241998 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:00.241966 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6151b4cb-ae22-4b41-ab45-86e3ae63e7f4-cert\") pod \"kserve-controller-manager-856948b99f-2cqdx\" (UID: \"6151b4cb-ae22-4b41-ab45-86e3ae63e7f4\") " pod="opendatahub/kserve-controller-manager-856948b99f-2cqdx" Apr 21 02:48:00.244794 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:00.244768 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6151b4cb-ae22-4b41-ab45-86e3ae63e7f4-cert\") pod \"kserve-controller-manager-856948b99f-2cqdx\" (UID: \"6151b4cb-ae22-4b41-ab45-86e3ae63e7f4\") " pod="opendatahub/kserve-controller-manager-856948b99f-2cqdx" Apr 21 02:48:00.388171 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:00.388136 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-2cqdx" Apr 21 02:48:00.538824 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:00.538786 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-2cqdx"] Apr 21 02:48:00.541596 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:48:00.541558 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6151b4cb_ae22_4b41_ab45_86e3ae63e7f4.slice/crio-9702d3ab51293d19b91c4531d091ec94a7edfd09433d88343c3e3f732ccf4866 WatchSource:0}: Error finding container 9702d3ab51293d19b91c4531d091ec94a7edfd09433d88343c3e3f732ccf4866: Status 404 returned error can't find the container with id 9702d3ab51293d19b91c4531d091ec94a7edfd09433d88343c3e3f732ccf4866 Apr 21 02:48:01.162665 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:01.162621 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-2cqdx" event={"ID":"6151b4cb-ae22-4b41-ab45-86e3ae63e7f4","Type":"ContainerStarted","Data":"9702d3ab51293d19b91c4531d091ec94a7edfd09433d88343c3e3f732ccf4866"} Apr 21 02:48:02.167278 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:02.167234 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-66sb4" event={"ID":"88add22f-56d1-4b2f-8a4a-5654c5215f6b","Type":"ContainerStarted","Data":"0ba9327b23219f9b7a593c0abab6d4a7ca21b2bda50902a39eb61bb3b5608069"} Apr 21 02:48:02.167680 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:02.167310 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-66sb4" Apr 21 02:48:02.186576 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:02.186505 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-66sb4" podStartSLOduration=1.502421717 podStartE2EDuration="4.186465977s" podCreationTimestamp="2026-04-21 02:47:58 +0000 UTC" firstStartedPulling="2026-04-21 02:47:58.899324969 +0000 UTC m=+446.583869212" lastFinishedPulling="2026-04-21 02:48:01.58336922 +0000 UTC m=+449.267913472" observedRunningTime="2026-04-21 02:48:02.185208127 +0000 UTC m=+449.869752390" watchObservedRunningTime="2026-04-21 02:48:02.186465977 +0000 UTC m=+449.871010262" Apr 21 02:48:03.004703 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:03.004659 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-gmv48" Apr 21 02:48:03.005123 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:03.005107 2572 scope.go:117] "RemoveContainer" containerID="85da211d8a95592bf29ccfd57baaec521171d074968783ff86aeab63ce855985" Apr 21 02:48:03.005356 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:48:03.005335 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-gmv48_opendatahub(845205e9-4ed6-439b-9e1e-f2f66fecc8f3)\"" pod="opendatahub/odh-model-controller-858dbf95b8-gmv48" podUID="845205e9-4ed6-439b-9e1e-f2f66fecc8f3" Apr 21 02:48:04.174553 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:04.174437 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-2cqdx" event={"ID":"6151b4cb-ae22-4b41-ab45-86e3ae63e7f4","Type":"ContainerStarted","Data":"1599fe8fdcdeb2f550467fc6480dd63ce1d5f742d181ff6b48137eddacbf3aac"} Apr 21 02:48:04.174924 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:04.174580 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-2cqdx" Apr 21 02:48:04.190251 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:04.190195 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-2cqdx" podStartSLOduration=1.9550081000000001 podStartE2EDuration="5.190178395s" podCreationTimestamp="2026-04-21 02:47:59 +0000 UTC" firstStartedPulling="2026-04-21 02:48:00.543096584 +0000 UTC m=+448.227640825" lastFinishedPulling="2026-04-21 02:48:03.778266865 +0000 UTC m=+451.462811120" observedRunningTime="2026-04-21 02:48:04.189229584 +0000 UTC m=+451.873773838" watchObservedRunningTime="2026-04-21 02:48:04.190178395 +0000 UTC m=+451.874722649" Apr 21 02:48:13.004928 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:13.004883 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-gmv48" Apr 21 02:48:13.005482 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:13.005388 2572 scope.go:117] "RemoveContainer" containerID="85da211d8a95592bf29ccfd57baaec521171d074968783ff86aeab63ce855985" Apr 21 02:48:13.173113 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:13.173077 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-66sb4" Apr 21 02:48:14.205462 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:14.205429 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-gmv48" 
event={"ID":"845205e9-4ed6-439b-9e1e-f2f66fecc8f3","Type":"ContainerStarted","Data":"677046e639d2cf07e15e22d355ced4b5675c884c83fd6700b46c52048044d94b"} Apr 21 02:48:14.205910 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:14.205648 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-gmv48" Apr 21 02:48:14.234743 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:14.234670 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-gmv48" podStartSLOduration=3.018045786 podStartE2EDuration="23.23464763s" podCreationTimestamp="2026-04-21 02:47:51 +0000 UTC" firstStartedPulling="2026-04-21 02:47:53.146060326 +0000 UTC m=+440.830604578" lastFinishedPulling="2026-04-21 02:48:13.362662179 +0000 UTC m=+461.047206422" observedRunningTime="2026-04-21 02:48:14.227063493 +0000 UTC m=+461.911607750" watchObservedRunningTime="2026-04-21 02:48:14.23464763 +0000 UTC m=+461.919191891" Apr 21 02:48:23.112275 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.112223 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf"] Apr 21 02:48:23.115247 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.115224 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf" Apr 21 02:48:23.117448 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.117417 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 21 02:48:23.117592 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.117420 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 21 02:48:23.117592 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.117496 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-6qwf9\"" Apr 21 02:48:23.117592 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.117517 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 21 02:48:23.117759 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.117746 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 21 02:48:23.130030 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.129998 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf"] Apr 21 02:48:23.208604 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.208568 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/861f6e92-414d-4fcb-ad9f-07d80cc5e5d2-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-cclxf\" (UID: \"861f6e92-414d-4fcb-ad9f-07d80cc5e5d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf" Apr 21 02:48:23.208604 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.208608 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/861f6e92-414d-4fcb-ad9f-07d80cc5e5d2-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-cclxf\" (UID: \"861f6e92-414d-4fcb-ad9f-07d80cc5e5d2\") " 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf" Apr 21 02:48:23.208831 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.208627 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxsg8\" (UniqueName: \"kubernetes.io/projected/861f6e92-414d-4fcb-ad9f-07d80cc5e5d2-kube-api-access-mxsg8\") pod \"istiod-openshift-gateway-55ff986f96-cclxf\" (UID: \"861f6e92-414d-4fcb-ad9f-07d80cc5e5d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf" Apr 21 02:48:23.208831 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.208691 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/861f6e92-414d-4fcb-ad9f-07d80cc5e5d2-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-cclxf\" (UID: \"861f6e92-414d-4fcb-ad9f-07d80cc5e5d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf" Apr 21 02:48:23.208831 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.208740 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/861f6e92-414d-4fcb-ad9f-07d80cc5e5d2-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-cclxf\" (UID: \"861f6e92-414d-4fcb-ad9f-07d80cc5e5d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf" Apr 21 02:48:23.208831 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.208764 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/861f6e92-414d-4fcb-ad9f-07d80cc5e5d2-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-cclxf\" (UID: \"861f6e92-414d-4fcb-ad9f-07d80cc5e5d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf" Apr 21 02:48:23.208951 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.208847 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/861f6e92-414d-4fcb-ad9f-07d80cc5e5d2-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-cclxf\" (UID: \"861f6e92-414d-4fcb-ad9f-07d80cc5e5d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf" Apr 21 02:48:23.310619 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.310518 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/861f6e92-414d-4fcb-ad9f-07d80cc5e5d2-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-cclxf\" (UID: \"861f6e92-414d-4fcb-ad9f-07d80cc5e5d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf" Apr 21 02:48:23.310820 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.310689 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/861f6e92-414d-4fcb-ad9f-07d80cc5e5d2-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-cclxf\" (UID: \"861f6e92-414d-4fcb-ad9f-07d80cc5e5d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf" Apr 21 02:48:23.310820 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.310812 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/861f6e92-414d-4fcb-ad9f-07d80cc5e5d2-istio-token\") pod 
\"istiod-openshift-gateway-55ff986f96-cclxf\" (UID: \"861f6e92-414d-4fcb-ad9f-07d80cc5e5d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf" Apr 21 02:48:23.310943 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.310843 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxsg8\" (UniqueName: \"kubernetes.io/projected/861f6e92-414d-4fcb-ad9f-07d80cc5e5d2-kube-api-access-mxsg8\") pod \"istiod-openshift-gateway-55ff986f96-cclxf\" (UID: \"861f6e92-414d-4fcb-ad9f-07d80cc5e5d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf" Apr 21 02:48:23.311031 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.311006 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/861f6e92-414d-4fcb-ad9f-07d80cc5e5d2-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-cclxf\" (UID: \"861f6e92-414d-4fcb-ad9f-07d80cc5e5d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf" Apr 21 02:48:23.311089 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.311071 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/861f6e92-414d-4fcb-ad9f-07d80cc5e5d2-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-cclxf\" (UID: \"861f6e92-414d-4fcb-ad9f-07d80cc5e5d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf" Apr 21 02:48:23.311143 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.311111 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/861f6e92-414d-4fcb-ad9f-07d80cc5e5d2-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-cclxf\" (UID: \"861f6e92-414d-4fcb-ad9f-07d80cc5e5d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf" Apr 21 02:48:23.311659 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.311634 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/861f6e92-414d-4fcb-ad9f-07d80cc5e5d2-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-cclxf\" (UID: \"861f6e92-414d-4fcb-ad9f-07d80cc5e5d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf" Apr 21 02:48:23.313456 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.313410 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/861f6e92-414d-4fcb-ad9f-07d80cc5e5d2-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-cclxf\" (UID: \"861f6e92-414d-4fcb-ad9f-07d80cc5e5d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf" Apr 21 02:48:23.313783 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.313759 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/861f6e92-414d-4fcb-ad9f-07d80cc5e5d2-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-cclxf\" (UID: \"861f6e92-414d-4fcb-ad9f-07d80cc5e5d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf" Apr 21 02:48:23.313852 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.313828 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/861f6e92-414d-4fcb-ad9f-07d80cc5e5d2-istio-csr-dns-cert\") pod 
\"istiod-openshift-gateway-55ff986f96-cclxf\" (UID: \"861f6e92-414d-4fcb-ad9f-07d80cc5e5d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf" Apr 21 02:48:23.314212 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.314196 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/861f6e92-414d-4fcb-ad9f-07d80cc5e5d2-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-cclxf\" (UID: \"861f6e92-414d-4fcb-ad9f-07d80cc5e5d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf" Apr 21 02:48:23.317725 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.317705 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/861f6e92-414d-4fcb-ad9f-07d80cc5e5d2-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-cclxf\" (UID: \"861f6e92-414d-4fcb-ad9f-07d80cc5e5d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf" Apr 21 02:48:23.317945 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.317925 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxsg8\" (UniqueName: \"kubernetes.io/projected/861f6e92-414d-4fcb-ad9f-07d80cc5e5d2-kube-api-access-mxsg8\") pod \"istiod-openshift-gateway-55ff986f96-cclxf\" (UID: \"861f6e92-414d-4fcb-ad9f-07d80cc5e5d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf" Apr 21 02:48:23.424023 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.423916 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf" Apr 21 02:48:23.577504 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:23.577466 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf"] Apr 21 02:48:23.581263 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:48:23.581227 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod861f6e92_414d_4fcb_ad9f_07d80cc5e5d2.slice/crio-65f981a5c6a2779ae3d63bdd8c07a3b6950a08ff5511c3b545557600d0b57649 WatchSource:0}: Error finding container 65f981a5c6a2779ae3d63bdd8c07a3b6950a08ff5511c3b545557600d0b57649: Status 404 returned error can't find the container with id 65f981a5c6a2779ae3d63bdd8c07a3b6950a08ff5511c3b545557600d0b57649 Apr 21 02:48:24.236286 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:24.236247 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf" event={"ID":"861f6e92-414d-4fcb-ad9f-07d80cc5e5d2","Type":"ContainerStarted","Data":"65f981a5c6a2779ae3d63bdd8c07a3b6950a08ff5511c3b545557600d0b57649"} Apr 21 02:48:25.211758 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:25.211725 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-gmv48" Apr 21 02:48:26.112239 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:26.112198 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 21 02:48:26.112489 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:26.112284 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 21 
02:48:26.244688 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:26.244626 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf" event={"ID":"861f6e92-414d-4fcb-ad9f-07d80cc5e5d2","Type":"ContainerStarted","Data":"fd5f8597a02bf6387cfe6059ee03571b57643f06a4c882bde318bc26fa075e4a"} Apr 21 02:48:26.244888 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:26.244757 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf" Apr 21 02:48:26.265092 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:26.265028 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf" podStartSLOduration=0.73693832 podStartE2EDuration="3.265005821s" podCreationTimestamp="2026-04-21 02:48:23 +0000 UTC" firstStartedPulling="2026-04-21 02:48:23.58387803 +0000 UTC m=+471.268422267" lastFinishedPulling="2026-04-21 02:48:26.111945507 +0000 UTC m=+473.796489768" observedRunningTime="2026-04-21 02:48:26.262812734 +0000 UTC m=+473.947356994" watchObservedRunningTime="2026-04-21 02:48:26.265005821 +0000 UTC m=+473.949550082" Apr 21 02:48:27.250377 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:27.250338 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cclxf" Apr 21 02:48:35.184489 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:48:35.184457 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-2cqdx" Apr 21 02:49:14.173288 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:14.173254 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-96b6f"] Apr 21 02:49:14.180563 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:14.180514 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-96b6f" Apr 21 02:49:14.182687 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:14.182666 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 21 02:49:14.182794 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:14.182665 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 21 02:49:14.182794 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:14.182666 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 21 02:49:14.183257 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:14.183243 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-lnlnb\"" Apr 21 02:49:14.187137 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:14.187110 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-96b6f"] Apr 21 02:49:14.316450 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:14.316415 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csgfs\" (UniqueName: \"kubernetes.io/projected/d1e291af-d9ee-49ef-895b-be5a6e705eaf-kube-api-access-csgfs\") pod \"dns-operator-controller-manager-648d5c98bc-96b6f\" (UID: \"d1e291af-d9ee-49ef-895b-be5a6e705eaf\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-96b6f" Apr 21 02:49:14.416902 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:14.416856 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csgfs\" (UniqueName: \"kubernetes.io/projected/d1e291af-d9ee-49ef-895b-be5a6e705eaf-kube-api-access-csgfs\") pod \"dns-operator-controller-manager-648d5c98bc-96b6f\" (UID: \"d1e291af-d9ee-49ef-895b-be5a6e705eaf\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-96b6f" Apr 21 02:49:14.426278 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:14.426211 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-csgfs\" (UniqueName: \"kubernetes.io/projected/d1e291af-d9ee-49ef-895b-be5a6e705eaf-kube-api-access-csgfs\") pod \"dns-operator-controller-manager-648d5c98bc-96b6f\" (UID: \"d1e291af-d9ee-49ef-895b-be5a6e705eaf\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-96b6f" Apr 21 02:49:14.491217 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:14.491165 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-96b6f" Apr 21 02:49:14.615955 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:14.615918 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-96b6f"] Apr 21 02:49:14.619651 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:49:14.619620 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1e291af_d9ee_49ef_895b_be5a6e705eaf.slice/crio-d3ccc9061ed9069d8c4513bd9c7b18ee88b1248141da45564ca9bd2415693f98 WatchSource:0}: Error finding container d3ccc9061ed9069d8c4513bd9c7b18ee88b1248141da45564ca9bd2415693f98: Status 404 returned error can't find the container with id d3ccc9061ed9069d8c4513bd9c7b18ee88b1248141da45564ca9bd2415693f98 Apr 21 02:49:15.397831 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:15.397793 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-96b6f" event={"ID":"d1e291af-d9ee-49ef-895b-be5a6e705eaf","Type":"ContainerStarted","Data":"d3ccc9061ed9069d8c4513bd9c7b18ee88b1248141da45564ca9bd2415693f98"} Apr 21 02:49:17.405677 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:17.405639 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-96b6f" event={"ID":"d1e291af-d9ee-49ef-895b-be5a6e705eaf","Type":"ContainerStarted","Data":"c41f286ceab89b255d62c08311e4cd2bf01ee18583feb3afafc2675a338ea967"} Apr 21 02:49:17.406106 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:17.405749 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-96b6f" Apr 21 02:49:17.422689 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:17.422635 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-96b6f" podStartSLOduration=1.4901785539999999 podStartE2EDuration="3.422617161s" podCreationTimestamp="2026-04-21 02:49:14 +0000 UTC" firstStartedPulling="2026-04-21 02:49:14.621670393 +0000 UTC m=+522.306214634" lastFinishedPulling="2026-04-21 02:49:16.554109001 +0000 UTC m=+524.238653241" observedRunningTime="2026-04-21 02:49:17.421688644 +0000 UTC m=+525.106232905" watchObservedRunningTime="2026-04-21 02:49:17.422617161 +0000 UTC m=+525.107161421" Apr 21 02:49:17.578286 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:17.578250 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-snhbz"] Apr 21 02:49:17.581478 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:17.581454 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-snhbz" Apr 21 02:49:17.583786 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:17.583759 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-sqn25\"" Apr 21 02:49:17.589647 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:17.589600 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-snhbz"] Apr 21 02:49:17.745482 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:17.745441 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r5cl\" (UniqueName: \"kubernetes.io/projected/7a06f9aa-981f-4e75-8a6f-6879c6fd8cef-kube-api-access-4r5cl\") pod \"limitador-operator-controller-manager-85c4996f8c-snhbz\" (UID: \"7a06f9aa-981f-4e75-8a6f-6879c6fd8cef\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-snhbz" Apr 21 02:49:17.846693 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:17.846650 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4r5cl\" (UniqueName: \"kubernetes.io/projected/7a06f9aa-981f-4e75-8a6f-6879c6fd8cef-kube-api-access-4r5cl\") pod \"limitador-operator-controller-manager-85c4996f8c-snhbz\" (UID: \"7a06f9aa-981f-4e75-8a6f-6879c6fd8cef\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-snhbz" Apr 21 02:49:17.854830 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:17.854803 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r5cl\" (UniqueName: \"kubernetes.io/projected/7a06f9aa-981f-4e75-8a6f-6879c6fd8cef-kube-api-access-4r5cl\") pod \"limitador-operator-controller-manager-85c4996f8c-snhbz\" (UID: \"7a06f9aa-981f-4e75-8a6f-6879c6fd8cef\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-snhbz" Apr 21 02:49:17.893295 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:17.893253 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-snhbz" Apr 21 02:49:18.023678 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:18.023489 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-snhbz"] Apr 21 02:49:18.026717 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:49:18.026683 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a06f9aa_981f_4e75_8a6f_6879c6fd8cef.slice/crio-a3d41e7165b4b18b1766da5ebbb8b2de83bd19bebf35b5f63b4926225add0382 WatchSource:0}: Error finding container a3d41e7165b4b18b1766da5ebbb8b2de83bd19bebf35b5f63b4926225add0382: Status 404 returned error can't find the container with id a3d41e7165b4b18b1766da5ebbb8b2de83bd19bebf35b5f63b4926225add0382 Apr 21 02:49:18.409916 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:18.409828 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-snhbz" event={"ID":"7a06f9aa-981f-4e75-8a6f-6879c6fd8cef","Type":"ContainerStarted","Data":"a3d41e7165b4b18b1766da5ebbb8b2de83bd19bebf35b5f63b4926225add0382"} Apr 21 02:49:20.418909 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:20.418818 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-snhbz" event={"ID":"7a06f9aa-981f-4e75-8a6f-6879c6fd8cef","Type":"ContainerStarted","Data":"b12204517c7f9004d3162775019f992363e8c03ad2598ebc0583ac4eb0a44494"} Apr 21 02:49:20.418909 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:20.418883 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-snhbz" Apr 21 02:49:20.435981 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:20.435921 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-snhbz" podStartSLOduration=1.454000245 podStartE2EDuration="3.435904656s" podCreationTimestamp="2026-04-21 02:49:17 +0000 UTC" firstStartedPulling="2026-04-21 02:49:18.028695575 +0000 UTC m=+525.713239815" lastFinishedPulling="2026-04-21 02:49:20.010599987 +0000 UTC m=+527.695144226" observedRunningTime="2026-04-21 02:49:20.434228797 +0000 UTC m=+528.118773057" watchObservedRunningTime="2026-04-21 02:49:20.435904656 +0000 UTC m=+528.120448916" Apr 21 02:49:28.412345 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:28.412263 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-96b6f" Apr 21 02:49:31.425612 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:31.425578 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-snhbz" Apr 21 02:49:40.632923 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:40.632885 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qzf6g"] Apr 21 02:49:40.635930 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:40.635913 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qzf6g" Apr 21 02:49:40.638256 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:40.638234 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-tkql6\"" Apr 21 02:49:40.646967 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:40.646937 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qzf6g"] Apr 21 02:49:40.716592 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:40.716544 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/23f4c489-8c75-4805-910e-48546164809d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-qzf6g\" (UID: \"23f4c489-8c75-4805-910e-48546164809d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qzf6g" Apr 21 02:49:40.716768 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:40.716624 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4pxh\" (UniqueName: \"kubernetes.io/projected/23f4c489-8c75-4805-910e-48546164809d-kube-api-access-s4pxh\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-qzf6g\" (UID: \"23f4c489-8c75-4805-910e-48546164809d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qzf6g" Apr 21 02:49:40.817842 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:40.817808 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/23f4c489-8c75-4805-910e-48546164809d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-qzf6g\" (UID: \"23f4c489-8c75-4805-910e-48546164809d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qzf6g" Apr 21 02:49:40.818010 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:40.817866 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4pxh\" (UniqueName: \"kubernetes.io/projected/23f4c489-8c75-4805-910e-48546164809d-kube-api-access-s4pxh\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-qzf6g\" (UID: \"23f4c489-8c75-4805-910e-48546164809d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qzf6g" Apr 21 02:49:40.818269 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:40.818248 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/23f4c489-8c75-4805-910e-48546164809d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-qzf6g\" (UID: \"23f4c489-8c75-4805-910e-48546164809d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qzf6g" Apr 21 02:49:40.833450 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:40.833424 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4pxh\" (UniqueName: \"kubernetes.io/projected/23f4c489-8c75-4805-910e-48546164809d-kube-api-access-s4pxh\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-qzf6g\" (UID: \"23f4c489-8c75-4805-910e-48546164809d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qzf6g" Apr 21 02:49:40.946769 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:40.946736 2572 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qzf6g" Apr 21 02:49:41.074497 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:41.074464 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qzf6g"] Apr 21 02:49:41.078017 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:49:41.077987 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23f4c489_8c75_4805_910e_48546164809d.slice/crio-9d8df3bbc954516a37c2c5adb5c2ed0703088b0e88fcd8a47eed33167b97f08e WatchSource:0}: Error finding container 9d8df3bbc954516a37c2c5adb5c2ed0703088b0e88fcd8a47eed33167b97f08e: Status 404 returned error can't find the container with id 9d8df3bbc954516a37c2c5adb5c2ed0703088b0e88fcd8a47eed33167b97f08e Apr 21 02:49:41.321807 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:41.321707 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qzf6g"] Apr 21 02:49:41.327621 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:41.327587 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qzf6g"] Apr 21 02:49:41.341403 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:41.341370 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-snhbz"] Apr 21 02:49:41.341662 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:41.341624 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-snhbz" podUID="7a06f9aa-981f-4e75-8a6f-6879c6fd8cef" containerName="manager" containerID="cri-o://b12204517c7f9004d3162775019f992363e8c03ad2598ebc0583ac4eb0a44494" gracePeriod=2 Apr 21 02:49:41.345253 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:41.345223 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg"] Apr 21 02:49:41.351389 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:41.351360 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg" Apr 21 02:49:41.360386 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:41.360334 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg"] Apr 21 02:49:41.362111 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:41.362084 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-snhbz"] Apr 21 02:49:41.422486 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:41.422459 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b18ea2c9-e34a-492d-a1aa-d9bdc51993bf-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg\" (UID: \"b18ea2c9-e34a-492d-a1aa-d9bdc51993bf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg" Apr 21 02:49:41.422665 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:41.422492 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjmfj\" (UniqueName: \"kubernetes.io/projected/b18ea2c9-e34a-492d-a1aa-d9bdc51993bf-kube-api-access-pjmfj\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg\" (UID: \"b18ea2c9-e34a-492d-a1aa-d9bdc51993bf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg" Apr 21 02:49:41.486174 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:41.486140 2572 generic.go:358] "Generic (PLEG): container finished" podID="7a06f9aa-981f-4e75-8a6f-6879c6fd8cef" containerID="b12204517c7f9004d3162775019f992363e8c03ad2598ebc0583ac4eb0a44494" exitCode=0 Apr 21 02:49:41.523245 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:41.523200 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b18ea2c9-e34a-492d-a1aa-d9bdc51993bf-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg\" (UID: \"b18ea2c9-e34a-492d-a1aa-d9bdc51993bf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg" Apr 21 02:49:41.523415 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:41.523268 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pjmfj\" (UniqueName: \"kubernetes.io/projected/b18ea2c9-e34a-492d-a1aa-d9bdc51993bf-kube-api-access-pjmfj\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg\" (UID: \"b18ea2c9-e34a-492d-a1aa-d9bdc51993bf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg" Apr 21 02:49:41.523631 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:41.523608 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b18ea2c9-e34a-492d-a1aa-d9bdc51993bf-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg\" (UID: \"b18ea2c9-e34a-492d-a1aa-d9bdc51993bf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg" Apr 21 02:49:41.532902 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:41.532868 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjmfj\" (UniqueName: \"kubernetes.io/projected/b18ea2c9-e34a-492d-a1aa-d9bdc51993bf-kube-api-access-pjmfj\") pod 
\"kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg\" (UID: \"b18ea2c9-e34a-492d-a1aa-d9bdc51993bf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg" Apr 21 02:49:41.676388 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:41.676357 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-snhbz" Apr 21 02:49:41.678327 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:41.678286 2572 status_manager.go:895] "Failed to get status for pod" podUID="7a06f9aa-981f-4e75-8a6f-6879c6fd8cef" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-snhbz" err="pods \"limitador-operator-controller-manager-85c4996f8c-snhbz\" is forbidden: User \"system:node:ip-10-0-137-147.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-147.ec2.internal' and this object" Apr 21 02:49:41.718386 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:41.718353 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg" Apr 21 02:49:41.724504 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:41.724474 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r5cl\" (UniqueName: \"kubernetes.io/projected/7a06f9aa-981f-4e75-8a6f-6879c6fd8cef-kube-api-access-4r5cl\") pod \"7a06f9aa-981f-4e75-8a6f-6879c6fd8cef\" (UID: \"7a06f9aa-981f-4e75-8a6f-6879c6fd8cef\") " Apr 21 02:49:41.726958 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:41.726924 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a06f9aa-981f-4e75-8a6f-6879c6fd8cef-kube-api-access-4r5cl" (OuterVolumeSpecName: "kube-api-access-4r5cl") pod "7a06f9aa-981f-4e75-8a6f-6879c6fd8cef" (UID: "7a06f9aa-981f-4e75-8a6f-6879c6fd8cef"). InnerVolumeSpecName "kube-api-access-4r5cl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:49:41.825849 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:41.825800 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4r5cl\" (UniqueName: \"kubernetes.io/projected/7a06f9aa-981f-4e75-8a6f-6879c6fd8cef-kube-api-access-4r5cl\") on node \"ip-10-0-137-147.ec2.internal\" DevicePath \"\"" Apr 21 02:49:42.070465 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:42.070408 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg"] Apr 21 02:49:42.101403 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:49:42.101363 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb18ea2c9_e34a_492d_a1aa_d9bdc51993bf.slice/crio-65b18daf3f1dea068a6f760b50e34dc7c1b264700317d8690ed4d6a17caaf371 WatchSource:0}: Error finding container 65b18daf3f1dea068a6f760b50e34dc7c1b264700317d8690ed4d6a17caaf371: Status 404 returned error can't find the container with id 65b18daf3f1dea068a6f760b50e34dc7c1b264700317d8690ed4d6a17caaf371 Apr 21 02:49:42.424414 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:42.424356 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-snhbz" podUID="7a06f9aa-981f-4e75-8a6f-6879c6fd8cef" containerName="manager" probeResult="failure" output="Get \"http://10.132.0.21:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 21 02:49:42.491437 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:42.491383 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-snhbz" Apr 21 02:49:42.491437 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:42.491426 2572 scope.go:117] "RemoveContainer" containerID="b12204517c7f9004d3162775019f992363e8c03ad2598ebc0583ac4eb0a44494" Apr 21 02:49:42.492942 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:42.492907 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg" event={"ID":"b18ea2c9-e34a-492d-a1aa-d9bdc51993bf","Type":"ContainerStarted","Data":"65b18daf3f1dea068a6f760b50e34dc7c1b264700317d8690ed4d6a17caaf371"} Apr 21 02:49:42.493661 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:42.493628 2572 status_manager.go:895] "Failed to get status for pod" podUID="7a06f9aa-981f-4e75-8a6f-6879c6fd8cef" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-snhbz" err="pods \"limitador-operator-controller-manager-85c4996f8c-snhbz\" is forbidden: User \"system:node:ip-10-0-137-147.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-147.ec2.internal' and this object" Apr 21 02:49:42.503768 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:42.503732 2572 status_manager.go:895] "Failed to get status for pod" podUID="7a06f9aa-981f-4e75-8a6f-6879c6fd8cef" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-snhbz" err="pods \"limitador-operator-controller-manager-85c4996f8c-snhbz\" is forbidden: User \"system:node:ip-10-0-137-147.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-147.ec2.internal' and this object" Apr 21 02:49:42.911480 
ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:42.911140 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a06f9aa-981f-4e75-8a6f-6879c6fd8cef" path="/var/lib/kubelet/pods/7a06f9aa-981f-4e75-8a6f-6879c6fd8cef/volumes" Apr 21 02:49:42.930658 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:42.930579 2572 status_manager.go:895] "Failed to get status for pod" podUID="7a06f9aa-981f-4e75-8a6f-6879c6fd8cef" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-snhbz" err="pods \"limitador-operator-controller-manager-85c4996f8c-snhbz\" is forbidden: User \"system:node:ip-10-0-137-147.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-147.ec2.internal' and this object" Apr 21 02:49:45.505881 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:45.505843 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg" event={"ID":"b18ea2c9-e34a-492d-a1aa-d9bdc51993bf","Type":"ContainerStarted","Data":"f51acdde94a607e86ba47d9103b4654168a532160e685454f94589313f2eb72d"} Apr 21 02:49:45.506348 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:45.505974 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg" Apr 21 02:49:45.507301 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:45.507247 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qzf6g" podUID="23f4c489-8c75-4805-910e-48546164809d" containerName="manager" containerID="cri-o://1b76e7802e37f0c2afccfb8efb96bbfbc835ec9b122654881baa05e059da3530" gracePeriod=2 Apr 21 02:49:45.524350 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:45.524286 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg" podStartSLOduration=1.861642679 podStartE2EDuration="4.524264722s" podCreationTimestamp="2026-04-21 02:49:41 +0000 UTC" firstStartedPulling="2026-04-21 02:49:42.104388995 +0000 UTC m=+549.788933236" lastFinishedPulling="2026-04-21 02:49:44.767011027 +0000 UTC m=+552.451555279" observedRunningTime="2026-04-21 02:49:45.522251449 +0000 UTC m=+553.206795709" watchObservedRunningTime="2026-04-21 02:49:45.524264722 +0000 UTC m=+553.208808983" Apr 21 02:49:45.744501 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:45.744479 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qzf6g" Apr 21 02:49:45.746264 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:45.746237 2572 status_manager.go:895] "Failed to get status for pod" podUID="23f4c489-8c75-4805-910e-48546164809d" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qzf6g" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-qzf6g\" is forbidden: User \"system:node:ip-10-0-137-147.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-147.ec2.internal' and this object" Apr 21 02:49:45.758691 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:45.758631 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/23f4c489-8c75-4805-910e-48546164809d-extensions-socket-volume\") pod \"23f4c489-8c75-4805-910e-48546164809d\" (UID: \"23f4c489-8c75-4805-910e-48546164809d\") " Apr 21 02:49:45.758780 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:45.758717 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4pxh\" (UniqueName: \"kubernetes.io/projected/23f4c489-8c75-4805-910e-48546164809d-kube-api-access-s4pxh\") pod \"23f4c489-8c75-4805-910e-48546164809d\" (UID: \"23f4c489-8c75-4805-910e-48546164809d\") " Apr 21 02:49:45.758943 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:45.758917 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23f4c489-8c75-4805-910e-48546164809d-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "23f4c489-8c75-4805-910e-48546164809d" (UID: "23f4c489-8c75-4805-910e-48546164809d"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:49:45.760739 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:45.760703 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23f4c489-8c75-4805-910e-48546164809d-kube-api-access-s4pxh" (OuterVolumeSpecName: "kube-api-access-s4pxh") pod "23f4c489-8c75-4805-910e-48546164809d" (UID: "23f4c489-8c75-4805-910e-48546164809d"). InnerVolumeSpecName "kube-api-access-s4pxh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:49:45.859941 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:45.859901 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s4pxh\" (UniqueName: \"kubernetes.io/projected/23f4c489-8c75-4805-910e-48546164809d-kube-api-access-s4pxh\") on node \"ip-10-0-137-147.ec2.internal\" DevicePath \"\"" Apr 21 02:49:45.859941 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:45.859928 2572 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/23f4c489-8c75-4805-910e-48546164809d-extensions-socket-volume\") on node \"ip-10-0-137-147.ec2.internal\" DevicePath \"\"" Apr 21 02:49:46.511041 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:46.511003 2572 generic.go:358] "Generic (PLEG): container finished" podID="23f4c489-8c75-4805-910e-48546164809d" containerID="1b76e7802e37f0c2afccfb8efb96bbfbc835ec9b122654881baa05e059da3530" exitCode=2 Apr 21 02:49:46.511501 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:46.511049 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qzf6g" Apr 21 02:49:46.511501 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:46.511057 2572 scope.go:117] "RemoveContainer" containerID="1b76e7802e37f0c2afccfb8efb96bbfbc835ec9b122654881baa05e059da3530" Apr 21 02:49:46.512915 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:46.512884 2572 status_manager.go:895] "Failed to get status for pod" podUID="23f4c489-8c75-4805-910e-48546164809d" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qzf6g" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-qzf6g\" is forbidden: User \"system:node:ip-10-0-137-147.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-147.ec2.internal' and this object" Apr 21 02:49:46.519621 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:46.519602 2572 scope.go:117] "RemoveContainer" containerID="1b76e7802e37f0c2afccfb8efb96bbfbc835ec9b122654881baa05e059da3530" Apr 21 02:49:46.519882 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:49:46.519860 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b76e7802e37f0c2afccfb8efb96bbfbc835ec9b122654881baa05e059da3530\": container with ID starting with 1b76e7802e37f0c2afccfb8efb96bbfbc835ec9b122654881baa05e059da3530 not found: ID does not exist" containerID="1b76e7802e37f0c2afccfb8efb96bbfbc835ec9b122654881baa05e059da3530" Apr 21 02:49:46.519939 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:46.519891 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b76e7802e37f0c2afccfb8efb96bbfbc835ec9b122654881baa05e059da3530"} err="failed to get container status \"1b76e7802e37f0c2afccfb8efb96bbfbc835ec9b122654881baa05e059da3530\": rpc error: code = NotFound desc = could not find container \"1b76e7802e37f0c2afccfb8efb96bbfbc835ec9b122654881baa05e059da3530\": container with ID starting with 1b76e7802e37f0c2afccfb8efb96bbfbc835ec9b122654881baa05e059da3530 not found: ID does not exist" Apr 21 02:49:46.521006 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:46.520983 2572 status_manager.go:895] "Failed to get status for pod" podUID="23f4c489-8c75-4805-910e-48546164809d" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-qzf6g" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-qzf6g\" is forbidden: User \"system:node:ip-10-0-137-147.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-147.ec2.internal' and this object" Apr 21 02:49:46.909835 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:46.909751 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23f4c489-8c75-4805-910e-48546164809d" path="/var/lib/kubelet/pods/23f4c489-8c75-4805-910e-48546164809d/volumes" Apr 21 02:49:56.513489 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:49:56.513458 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg" Apr 21 02:50:09.902550 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:09.902431 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg"] Apr 21 02:50:09.902992 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:09.902713 2572 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg" podUID="b18ea2c9-e34a-492d-a1aa-d9bdc51993bf" containerName="manager" containerID="cri-o://f51acdde94a607e86ba47d9103b4654168a532160e685454f94589313f2eb72d" gracePeriod=10 Apr 21 02:50:10.144311 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:10.144288 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg" Apr 21 02:50:10.257308 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:10.257268 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjmfj\" (UniqueName: \"kubernetes.io/projected/b18ea2c9-e34a-492d-a1aa-d9bdc51993bf-kube-api-access-pjmfj\") pod \"b18ea2c9-e34a-492d-a1aa-d9bdc51993bf\" (UID: \"b18ea2c9-e34a-492d-a1aa-d9bdc51993bf\") " Apr 21 02:50:10.257477 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:10.257335 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b18ea2c9-e34a-492d-a1aa-d9bdc51993bf-extensions-socket-volume\") pod \"b18ea2c9-e34a-492d-a1aa-d9bdc51993bf\" (UID: \"b18ea2c9-e34a-492d-a1aa-d9bdc51993bf\") " Apr 21 02:50:10.257857 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:10.257831 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b18ea2c9-e34a-492d-a1aa-d9bdc51993bf-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "b18ea2c9-e34a-492d-a1aa-d9bdc51993bf" (UID: "b18ea2c9-e34a-492d-a1aa-d9bdc51993bf"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 02:50:10.259418 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:10.259394 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b18ea2c9-e34a-492d-a1aa-d9bdc51993bf-kube-api-access-pjmfj" (OuterVolumeSpecName: "kube-api-access-pjmfj") pod "b18ea2c9-e34a-492d-a1aa-d9bdc51993bf" (UID: "b18ea2c9-e34a-492d-a1aa-d9bdc51993bf"). InnerVolumeSpecName "kube-api-access-pjmfj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:50:10.358065 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:10.358025 2572 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b18ea2c9-e34a-492d-a1aa-d9bdc51993bf-extensions-socket-volume\") on node \"ip-10-0-137-147.ec2.internal\" DevicePath \"\"" Apr 21 02:50:10.358065 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:10.358059 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pjmfj\" (UniqueName: \"kubernetes.io/projected/b18ea2c9-e34a-492d-a1aa-d9bdc51993bf-kube-api-access-pjmfj\") on node \"ip-10-0-137-147.ec2.internal\" DevicePath \"\"" Apr 21 02:50:10.592842 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:10.592752 2572 generic.go:358] "Generic (PLEG): container finished" podID="b18ea2c9-e34a-492d-a1aa-d9bdc51993bf" containerID="f51acdde94a607e86ba47d9103b4654168a532160e685454f94589313f2eb72d" exitCode=0 Apr 21 02:50:10.592842 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:10.592799 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg" event={"ID":"b18ea2c9-e34a-492d-a1aa-d9bdc51993bf","Type":"ContainerDied","Data":"f51acdde94a607e86ba47d9103b4654168a532160e685454f94589313f2eb72d"} Apr 21 02:50:10.592842 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:10.592811 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg" Apr 21 02:50:10.592842 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:10.592823 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg" event={"ID":"b18ea2c9-e34a-492d-a1aa-d9bdc51993bf","Type":"ContainerDied","Data":"65b18daf3f1dea068a6f760b50e34dc7c1b264700317d8690ed4d6a17caaf371"} Apr 21 02:50:10.592842 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:10.592838 2572 scope.go:117] "RemoveContainer" containerID="f51acdde94a607e86ba47d9103b4654168a532160e685454f94589313f2eb72d" Apr 21 02:50:10.600915 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:10.600890 2572 scope.go:117] "RemoveContainer" containerID="f51acdde94a607e86ba47d9103b4654168a532160e685454f94589313f2eb72d" Apr 21 02:50:10.601143 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:50:10.601126 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f51acdde94a607e86ba47d9103b4654168a532160e685454f94589313f2eb72d\": container with ID starting with f51acdde94a607e86ba47d9103b4654168a532160e685454f94589313f2eb72d not found: ID does not exist" containerID="f51acdde94a607e86ba47d9103b4654168a532160e685454f94589313f2eb72d" Apr 21 02:50:10.601192 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:10.601152 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f51acdde94a607e86ba47d9103b4654168a532160e685454f94589313f2eb72d"} err="failed to get container status \"f51acdde94a607e86ba47d9103b4654168a532160e685454f94589313f2eb72d\": rpc error: code = NotFound desc = could not find container \"f51acdde94a607e86ba47d9103b4654168a532160e685454f94589313f2eb72d\": container with ID starting with f51acdde94a607e86ba47d9103b4654168a532160e685454f94589313f2eb72d not found: ID does not exist" Apr 21 02:50:10.615073 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:10.615047 2572 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg"] Apr 21 02:50:10.619508 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:10.619487 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6tlfg"] Apr 21 02:50:10.909784 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:10.909710 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b18ea2c9-e34a-492d-a1aa-d9bdc51993bf" path="/var/lib/kubelet/pods/b18ea2c9-e34a-492d-a1aa-d9bdc51993bf/volumes" Apr 21 02:50:12.800327 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.800293 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z"] Apr 21 02:50:12.800801 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.800606 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b18ea2c9-e34a-492d-a1aa-d9bdc51993bf" containerName="manager" Apr 21 02:50:12.800801 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.800618 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b18ea2c9-e34a-492d-a1aa-d9bdc51993bf" containerName="manager" Apr 21 02:50:12.800801 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.800634 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a06f9aa-981f-4e75-8a6f-6879c6fd8cef" containerName="manager" Apr 21 02:50:12.800801 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.800640 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a06f9aa-981f-4e75-8a6f-6879c6fd8cef" containerName="manager" Apr 21 02:50:12.800801 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.800646 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23f4c489-8c75-4805-910e-48546164809d" containerName="manager" Apr 21 02:50:12.800801 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.800652 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="23f4c489-8c75-4805-910e-48546164809d" containerName="manager" Apr 21 02:50:12.800801 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.800698 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="7a06f9aa-981f-4e75-8a6f-6879c6fd8cef" containerName="manager" Apr 21 02:50:12.800801 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.800708 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="23f4c489-8c75-4805-910e-48546164809d" containerName="manager" Apr 21 02:50:12.800801 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.800714 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b18ea2c9-e34a-492d-a1aa-d9bdc51993bf" containerName="manager" Apr 21 02:50:12.804951 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.804924 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:12.807691 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.807655 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-2544j\"" Apr 21 02:50:12.816992 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.816957 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z"] Apr 21 02:50:12.880818 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.880781 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0918a43b-f702-44ce-ad65-0d11bcadfd54-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-m5n8z\" (UID: \"0918a43b-f702-44ce-ad65-0d11bcadfd54\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:12.880986 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.880829 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0918a43b-f702-44ce-ad65-0d11bcadfd54-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-m5n8z\" (UID: \"0918a43b-f702-44ce-ad65-0d11bcadfd54\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:12.880986 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.880873 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/0918a43b-f702-44ce-ad65-0d11bcadfd54-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-m5n8z\" (UID: \"0918a43b-f702-44ce-ad65-0d11bcadfd54\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:12.880986 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.880893 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/0918a43b-f702-44ce-ad65-0d11bcadfd54-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-m5n8z\" (UID: \"0918a43b-f702-44ce-ad65-0d11bcadfd54\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:12.880986 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.880918 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/0918a43b-f702-44ce-ad65-0d11bcadfd54-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-m5n8z\" (UID: \"0918a43b-f702-44ce-ad65-0d11bcadfd54\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:12.881120 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.880989 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0918a43b-f702-44ce-ad65-0d11bcadfd54-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-m5n8z\" (UID: \"0918a43b-f702-44ce-ad65-0d11bcadfd54\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:12.881120 ip-10-0-137-147 kubenswrapper[2572]: 
I0421 02:50:12.881045 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0918a43b-f702-44ce-ad65-0d11bcadfd54-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-m5n8z\" (UID: \"0918a43b-f702-44ce-ad65-0d11bcadfd54\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:12.881120 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.881065 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0918a43b-f702-44ce-ad65-0d11bcadfd54-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-m5n8z\" (UID: \"0918a43b-f702-44ce-ad65-0d11bcadfd54\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:12.881120 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.881081 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlz4b\" (UniqueName: \"kubernetes.io/projected/0918a43b-f702-44ce-ad65-0d11bcadfd54-kube-api-access-nlz4b\") pod \"maas-default-gateway-openshift-default-845c6b4b48-m5n8z\" (UID: \"0918a43b-f702-44ce-ad65-0d11bcadfd54\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:12.981589 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.981558 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0918a43b-f702-44ce-ad65-0d11bcadfd54-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-m5n8z\" (UID: \"0918a43b-f702-44ce-ad65-0d11bcadfd54\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:12.981778 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.981594 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0918a43b-f702-44ce-ad65-0d11bcadfd54-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-m5n8z\" (UID: \"0918a43b-f702-44ce-ad65-0d11bcadfd54\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:12.981778 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.981623 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nlz4b\" (UniqueName: \"kubernetes.io/projected/0918a43b-f702-44ce-ad65-0d11bcadfd54-kube-api-access-nlz4b\") pod \"maas-default-gateway-openshift-default-845c6b4b48-m5n8z\" (UID: \"0918a43b-f702-44ce-ad65-0d11bcadfd54\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:12.981778 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.981670 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0918a43b-f702-44ce-ad65-0d11bcadfd54-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-m5n8z\" (UID: \"0918a43b-f702-44ce-ad65-0d11bcadfd54\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:12.981778 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.981711 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: 
\"kubernetes.io/configmap/0918a43b-f702-44ce-ad65-0d11bcadfd54-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-m5n8z\" (UID: \"0918a43b-f702-44ce-ad65-0d11bcadfd54\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:12.981778 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.981759 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/0918a43b-f702-44ce-ad65-0d11bcadfd54-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-m5n8z\" (UID: \"0918a43b-f702-44ce-ad65-0d11bcadfd54\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:12.982043 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.981781 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/0918a43b-f702-44ce-ad65-0d11bcadfd54-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-m5n8z\" (UID: \"0918a43b-f702-44ce-ad65-0d11bcadfd54\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:12.982043 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.981825 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/0918a43b-f702-44ce-ad65-0d11bcadfd54-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-m5n8z\" (UID: \"0918a43b-f702-44ce-ad65-0d11bcadfd54\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:12.982043 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.981870 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0918a43b-f702-44ce-ad65-0d11bcadfd54-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-m5n8z\" (UID: \"0918a43b-f702-44ce-ad65-0d11bcadfd54\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:12.982196 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.982087 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0918a43b-f702-44ce-ad65-0d11bcadfd54-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-m5n8z\" (UID: \"0918a43b-f702-44ce-ad65-0d11bcadfd54\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:12.982251 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.982209 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0918a43b-f702-44ce-ad65-0d11bcadfd54-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-m5n8z\" (UID: \"0918a43b-f702-44ce-ad65-0d11bcadfd54\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:12.982497 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.982474 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/0918a43b-f702-44ce-ad65-0d11bcadfd54-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-m5n8z\" (UID: \"0918a43b-f702-44ce-ad65-0d11bcadfd54\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 
02:50:12.982663 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.982640 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/0918a43b-f702-44ce-ad65-0d11bcadfd54-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-m5n8z\" (UID: \"0918a43b-f702-44ce-ad65-0d11bcadfd54\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:12.983076 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.983056 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0918a43b-f702-44ce-ad65-0d11bcadfd54-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-m5n8z\" (UID: \"0918a43b-f702-44ce-ad65-0d11bcadfd54\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:12.984792 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.984775 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/0918a43b-f702-44ce-ad65-0d11bcadfd54-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-m5n8z\" (UID: \"0918a43b-f702-44ce-ad65-0d11bcadfd54\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:12.985088 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.985067 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0918a43b-f702-44ce-ad65-0d11bcadfd54-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-m5n8z\" (UID: \"0918a43b-f702-44ce-ad65-0d11bcadfd54\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:12.989084 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.989061 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0918a43b-f702-44ce-ad65-0d11bcadfd54-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-m5n8z\" (UID: \"0918a43b-f702-44ce-ad65-0d11bcadfd54\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:12.989179 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:12.989133 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlz4b\" (UniqueName: \"kubernetes.io/projected/0918a43b-f702-44ce-ad65-0d11bcadfd54-kube-api-access-nlz4b\") pod \"maas-default-gateway-openshift-default-845c6b4b48-m5n8z\" (UID: \"0918a43b-f702-44ce-ad65-0d11bcadfd54\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:13.121784 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:13.121687 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:13.254129 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:13.254093 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z"] Apr 21 02:50:13.258086 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:50:13.258040 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0918a43b_f702_44ce_ad65_0d11bcadfd54.slice/crio-b8297eff08f5df2b6ab709a5b1ca364853c30b564c2d210fb193598c7a922996 WatchSource:0}: Error finding container b8297eff08f5df2b6ab709a5b1ca364853c30b564c2d210fb193598c7a922996: Status 404 returned error can't find the container with id b8297eff08f5df2b6ab709a5b1ca364853c30b564c2d210fb193598c7a922996 Apr 21 02:50:13.603393 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:13.603361 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" event={"ID":"0918a43b-f702-44ce-ad65-0d11bcadfd54","Type":"ContainerStarted","Data":"b8297eff08f5df2b6ab709a5b1ca364853c30b564c2d210fb193598c7a922996"} Apr 21 02:50:31.006688 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:31.006646 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 21 02:50:31.006928 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:31.006740 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 21 02:50:31.006928 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:31.006774 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 21 02:50:31.664887 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:31.664851 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" event={"ID":"0918a43b-f702-44ce-ad65-0d11bcadfd54","Type":"ContainerStarted","Data":"7dda23970d277e5528541b8c2d5ca4f34627feed256a9cebb644985dc6c9a579"} Apr 21 02:50:31.683982 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:31.683934 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" podStartSLOduration=1.937554872 podStartE2EDuration="19.683918984s" podCreationTimestamp="2026-04-21 02:50:12 +0000 UTC" firstStartedPulling="2026-04-21 02:50:13.259986543 +0000 UTC m=+580.944530786" lastFinishedPulling="2026-04-21 02:50:31.006350646 +0000 UTC m=+598.690894898" observedRunningTime="2026-04-21 02:50:31.683096695 +0000 UTC m=+599.367640963" watchObservedRunningTime="2026-04-21 02:50:31.683918984 +0000 UTC m=+599.368463243" Apr 21 02:50:32.122274 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.122238 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:32.126946 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.126921 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:32.181785 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.181752 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8lzcc"] Apr 21 02:50:32.184946 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.184926 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-8lzcc" Apr 21 02:50:32.187299 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.187276 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-9hwh5\"" Apr 21 02:50:32.187382 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.187281 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 21 02:50:32.192362 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.192341 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8lzcc"] Apr 21 02:50:32.282676 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.282642 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8lzcc"] Apr 21 02:50:32.343532 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.343500 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ae2e91c1-3958-4fa1-be6f-b7fba031160f-config-file\") pod \"limitador-limitador-7d549b5b-8lzcc\" (UID: \"ae2e91c1-3958-4fa1-be6f-b7fba031160f\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8lzcc" Apr 21 02:50:32.343676 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.343564 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2lw7\" (UniqueName: \"kubernetes.io/projected/ae2e91c1-3958-4fa1-be6f-b7fba031160f-kube-api-access-v2lw7\") pod \"limitador-limitador-7d549b5b-8lzcc\" (UID: \"ae2e91c1-3958-4fa1-be6f-b7fba031160f\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8lzcc" Apr 21 02:50:32.444051 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.444022 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ae2e91c1-3958-4fa1-be6f-b7fba031160f-config-file\") pod \"limitador-limitador-7d549b5b-8lzcc\" (UID: \"ae2e91c1-3958-4fa1-be6f-b7fba031160f\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8lzcc" Apr 21 02:50:32.444196 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.444069 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2lw7\" (UniqueName: \"kubernetes.io/projected/ae2e91c1-3958-4fa1-be6f-b7fba031160f-kube-api-access-v2lw7\") pod \"limitador-limitador-7d549b5b-8lzcc\" (UID: \"ae2e91c1-3958-4fa1-be6f-b7fba031160f\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8lzcc" Apr 21 02:50:32.444658 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.444637 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ae2e91c1-3958-4fa1-be6f-b7fba031160f-config-file\") pod \"limitador-limitador-7d549b5b-8lzcc\" (UID: \"ae2e91c1-3958-4fa1-be6f-b7fba031160f\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8lzcc" Apr 21 02:50:32.456313 ip-10-0-137-147 kubenswrapper[2572]: I0421 
02:50:32.456283 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2lw7\" (UniqueName: \"kubernetes.io/projected/ae2e91c1-3958-4fa1-be6f-b7fba031160f-kube-api-access-v2lw7\") pod \"limitador-limitador-7d549b5b-8lzcc\" (UID: \"ae2e91c1-3958-4fa1-be6f-b7fba031160f\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8lzcc" Apr 21 02:50:32.495713 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.495682 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-8lzcc" Apr 21 02:50:32.613089 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.613056 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8lzcc"] Apr 21 02:50:32.616043 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:50:32.616016 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae2e91c1_3958_4fa1_be6f_b7fba031160f.slice/crio-af58bf52179c74f18785b1d8fa7ea6bede80ea0ed0fdcc4548e2bf2a4dd53bfc WatchSource:0}: Error finding container af58bf52179c74f18785b1d8fa7ea6bede80ea0ed0fdcc4548e2bf2a4dd53bfc: Status 404 returned error can't find the container with id af58bf52179c74f18785b1d8fa7ea6bede80ea0ed0fdcc4548e2bf2a4dd53bfc Apr 21 02:50:32.647143 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.647117 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2bmfc"] Apr 21 02:50:32.651623 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.651604 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-2bmfc" Apr 21 02:50:32.657076 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.657048 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2bmfc"] Apr 21 02:50:32.670127 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.670097 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-8lzcc" event={"ID":"ae2e91c1-3958-4fa1-be6f-b7fba031160f","Type":"ContainerStarted","Data":"af58bf52179c74f18785b1d8fa7ea6bede80ea0ed0fdcc4548e2bf2a4dd53bfc"} Apr 21 02:50:32.670323 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.670307 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:32.671259 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.671241 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-m5n8z" Apr 21 02:50:32.686696 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.686668 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2bmfc"] Apr 21 02:50:32.845920 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.845895 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nb99\" (UniqueName: \"kubernetes.io/projected/ee158c03-fdd4-4786-add4-1a851173ca55-kube-api-access-6nb99\") pod \"limitador-limitador-78c99df468-2bmfc\" (UID: \"ee158c03-fdd4-4786-add4-1a851173ca55\") " pod="kuadrant-system/limitador-limitador-78c99df468-2bmfc" Apr 21 02:50:32.846016 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.845996 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ee158c03-fdd4-4786-add4-1a851173ca55-config-file\") pod \"limitador-limitador-78c99df468-2bmfc\" (UID: \"ee158c03-fdd4-4786-add4-1a851173ca55\") " pod="kuadrant-system/limitador-limitador-78c99df468-2bmfc" Apr 21 02:50:32.946838 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.946801 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6nb99\" (UniqueName: \"kubernetes.io/projected/ee158c03-fdd4-4786-add4-1a851173ca55-kube-api-access-6nb99\") pod \"limitador-limitador-78c99df468-2bmfc\" (UID: \"ee158c03-fdd4-4786-add4-1a851173ca55\") " pod="kuadrant-system/limitador-limitador-78c99df468-2bmfc" Apr 21 02:50:32.947000 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.946868 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ee158c03-fdd4-4786-add4-1a851173ca55-config-file\") pod \"limitador-limitador-78c99df468-2bmfc\" (UID: \"ee158c03-fdd4-4786-add4-1a851173ca55\") " pod="kuadrant-system/limitador-limitador-78c99df468-2bmfc" Apr 21 02:50:32.947400 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.947381 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ee158c03-fdd4-4786-add4-1a851173ca55-config-file\") pod \"limitador-limitador-78c99df468-2bmfc\" (UID: \"ee158c03-fdd4-4786-add4-1a851173ca55\") " pod="kuadrant-system/limitador-limitador-78c99df468-2bmfc" Apr 21 02:50:32.961018 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.960992 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nb99\" (UniqueName: \"kubernetes.io/projected/ee158c03-fdd4-4786-add4-1a851173ca55-kube-api-access-6nb99\") pod \"limitador-limitador-78c99df468-2bmfc\" (UID: \"ee158c03-fdd4-4786-add4-1a851173ca55\") " pod="kuadrant-system/limitador-limitador-78c99df468-2bmfc" Apr 21 02:50:32.962678 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:32.962652 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-2bmfc" Apr 21 02:50:33.094122 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:33.094101 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2bmfc"] Apr 21 02:50:33.096197 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:50:33.096172 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee158c03_fdd4_4786_add4_1a851173ca55.slice/crio-6536c57e0a57f9e804a9d6c3dbd4e50be941b64d3da692b389f05e6aa55e55cc WatchSource:0}: Error finding container 6536c57e0a57f9e804a9d6c3dbd4e50be941b64d3da692b389f05e6aa55e55cc: Status 404 returned error can't find the container with id 6536c57e0a57f9e804a9d6c3dbd4e50be941b64d3da692b389f05e6aa55e55cc Apr 21 02:50:33.677163 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:33.677132 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-2bmfc" event={"ID":"ee158c03-fdd4-4786-add4-1a851173ca55","Type":"ContainerStarted","Data":"6536c57e0a57f9e804a9d6c3dbd4e50be941b64d3da692b389f05e6aa55e55cc"} Apr 21 02:50:36.688493 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:36.688457 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-2bmfc" event={"ID":"ee158c03-fdd4-4786-add4-1a851173ca55","Type":"ContainerStarted","Data":"83ea9003f6ca899588a99b40c8c9fdde832fd9c64b64e2a12f8e88fcb7a7260c"} Apr 21 02:50:36.688960 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:36.688554 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-2bmfc" Apr 21 02:50:36.689744 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:36.689718 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-8lzcc" event={"ID":"ae2e91c1-3958-4fa1-be6f-b7fba031160f","Type":"ContainerStarted","Data":"d4a8a7e3cc32c3aedf42049d1a12670478d236576a4b8d7192743be54b7d05ef"} Apr 21 02:50:36.689866 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:36.689856 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-8lzcc" Apr 21 02:50:36.703283 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:36.703240 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-2bmfc" podStartSLOduration=2.153168873 podStartE2EDuration="4.703224455s" podCreationTimestamp="2026-04-21 02:50:32 +0000 UTC" firstStartedPulling="2026-04-21 02:50:33.098069313 +0000 UTC m=+600.782613551" lastFinishedPulling="2026-04-21 02:50:35.648124895 +0000 UTC m=+603.332669133" observedRunningTime="2026-04-21 02:50:36.702577868 +0000 UTC m=+604.387122127" watchObservedRunningTime="2026-04-21 02:50:36.703224455 +0000 UTC m=+604.387768716" Apr 21 02:50:36.718868 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:36.718820 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-8lzcc" podStartSLOduration=1.6957036890000001 podStartE2EDuration="4.718809028s" podCreationTimestamp="2026-04-21 02:50:32 +0000 UTC" firstStartedPulling="2026-04-21 02:50:32.617999794 +0000 UTC m=+600.302544036" lastFinishedPulling="2026-04-21 02:50:35.641105134 +0000 UTC m=+603.325649375" observedRunningTime="2026-04-21 02:50:36.716649355 +0000 UTC m=+604.401193614" 
watchObservedRunningTime="2026-04-21 02:50:36.718809028 +0000 UTC m=+604.403353287" Apr 21 02:50:47.693127 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:47.693097 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-2bmfc" Apr 21 02:50:47.693679 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:47.693148 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-8lzcc" Apr 21 02:50:47.756336 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:47.756296 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8lzcc"] Apr 21 02:50:47.756558 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:47.756504 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-8lzcc" podUID="ae2e91c1-3958-4fa1-be6f-b7fba031160f" containerName="limitador" containerID="cri-o://d4a8a7e3cc32c3aedf42049d1a12670478d236576a4b8d7192743be54b7d05ef" gracePeriod=30 Apr 21 02:50:48.291031 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:48.291003 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-8lzcc" Apr 21 02:50:48.363321 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:48.363239 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ae2e91c1-3958-4fa1-be6f-b7fba031160f-config-file\") pod \"ae2e91c1-3958-4fa1-be6f-b7fba031160f\" (UID: \"ae2e91c1-3958-4fa1-be6f-b7fba031160f\") " Apr 21 02:50:48.363321 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:48.363306 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2lw7\" (UniqueName: \"kubernetes.io/projected/ae2e91c1-3958-4fa1-be6f-b7fba031160f-kube-api-access-v2lw7\") pod \"ae2e91c1-3958-4fa1-be6f-b7fba031160f\" (UID: \"ae2e91c1-3958-4fa1-be6f-b7fba031160f\") " Apr 21 02:50:48.363649 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:48.363625 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae2e91c1-3958-4fa1-be6f-b7fba031160f-config-file" (OuterVolumeSpecName: "config-file") pod "ae2e91c1-3958-4fa1-be6f-b7fba031160f" (UID: "ae2e91c1-3958-4fa1-be6f-b7fba031160f"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 02:50:48.365298 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:48.365278 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae2e91c1-3958-4fa1-be6f-b7fba031160f-kube-api-access-v2lw7" (OuterVolumeSpecName: "kube-api-access-v2lw7") pod "ae2e91c1-3958-4fa1-be6f-b7fba031160f" (UID: "ae2e91c1-3958-4fa1-be6f-b7fba031160f"). InnerVolumeSpecName "kube-api-access-v2lw7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:50:48.464187 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:48.464152 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v2lw7\" (UniqueName: \"kubernetes.io/projected/ae2e91c1-3958-4fa1-be6f-b7fba031160f-kube-api-access-v2lw7\") on node \"ip-10-0-137-147.ec2.internal\" DevicePath \"\"" Apr 21 02:50:48.464187 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:48.464183 2572 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ae2e91c1-3958-4fa1-be6f-b7fba031160f-config-file\") on node \"ip-10-0-137-147.ec2.internal\" DevicePath \"\"" Apr 21 02:50:48.727182 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:48.727147 2572 generic.go:358] "Generic (PLEG): container finished" podID="ae2e91c1-3958-4fa1-be6f-b7fba031160f" containerID="d4a8a7e3cc32c3aedf42049d1a12670478d236576a4b8d7192743be54b7d05ef" exitCode=0 Apr 21 02:50:48.727606 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:48.727217 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-8lzcc" Apr 21 02:50:48.727606 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:48.727219 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-8lzcc" event={"ID":"ae2e91c1-3958-4fa1-be6f-b7fba031160f","Type":"ContainerDied","Data":"d4a8a7e3cc32c3aedf42049d1a12670478d236576a4b8d7192743be54b7d05ef"} Apr 21 02:50:48.727606 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:48.727317 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-8lzcc" event={"ID":"ae2e91c1-3958-4fa1-be6f-b7fba031160f","Type":"ContainerDied","Data":"af58bf52179c74f18785b1d8fa7ea6bede80ea0ed0fdcc4548e2bf2a4dd53bfc"} Apr 21 02:50:48.727606 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:48.727332 2572 scope.go:117] "RemoveContainer" containerID="d4a8a7e3cc32c3aedf42049d1a12670478d236576a4b8d7192743be54b7d05ef" Apr 21 02:50:48.735134 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:48.735113 2572 scope.go:117] "RemoveContainer" containerID="d4a8a7e3cc32c3aedf42049d1a12670478d236576a4b8d7192743be54b7d05ef" Apr 21 02:50:48.735391 ip-10-0-137-147 kubenswrapper[2572]: E0421 02:50:48.735372 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4a8a7e3cc32c3aedf42049d1a12670478d236576a4b8d7192743be54b7d05ef\": container with ID starting with d4a8a7e3cc32c3aedf42049d1a12670478d236576a4b8d7192743be54b7d05ef not found: ID does not exist" containerID="d4a8a7e3cc32c3aedf42049d1a12670478d236576a4b8d7192743be54b7d05ef" Apr 21 02:50:48.735442 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:48.735400 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4a8a7e3cc32c3aedf42049d1a12670478d236576a4b8d7192743be54b7d05ef"} err="failed to get container status \"d4a8a7e3cc32c3aedf42049d1a12670478d236576a4b8d7192743be54b7d05ef\": rpc error: code = NotFound desc = could not find container \"d4a8a7e3cc32c3aedf42049d1a12670478d236576a4b8d7192743be54b7d05ef\": container with ID starting with d4a8a7e3cc32c3aedf42049d1a12670478d236576a4b8d7192743be54b7d05ef not found: ID does not exist" Apr 21 02:50:48.749875 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:48.749850 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kuadrant-system/limitador-limitador-7d549b5b-8lzcc"] Apr 21 02:50:48.753678 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:48.753659 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8lzcc"] Apr 21 02:50:48.908912 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:50:48.908869 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae2e91c1-3958-4fa1-be6f-b7fba031160f" path="/var/lib/kubelet/pods/ae2e91c1-3958-4fa1-be6f-b7fba031160f/volumes" Apr 21 02:51:20.063508 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:51:20.063470 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-mqqqx"] Apr 21 02:51:20.064268 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:51:20.063831 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ae2e91c1-3958-4fa1-be6f-b7fba031160f" containerName="limitador" Apr 21 02:51:20.064268 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:51:20.063848 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae2e91c1-3958-4fa1-be6f-b7fba031160f" containerName="limitador" Apr 21 02:51:20.064268 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:51:20.063902 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ae2e91c1-3958-4fa1-be6f-b7fba031160f" containerName="limitador" Apr 21 02:51:20.070318 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:51:20.070297 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/keycloak-operator-5c4df598dd-mqqqx" Apr 21 02:51:20.073113 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:51:20.073085 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\"" Apr 21 02:51:20.073259 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:51:20.073210 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"keycloak-operator-dockercfg-g575r\"" Apr 21 02:51:20.073259 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:51:20.073243 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\"" Apr 21 02:51:20.074470 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:51:20.074442 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-mqqqx"] Apr 21 02:51:20.224355 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:51:20.224303 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn2t5\" (UniqueName: \"kubernetes.io/projected/9360b100-6180-43f7-95f5-1d487e9a919a-kube-api-access-rn2t5\") pod \"keycloak-operator-5c4df598dd-mqqqx\" (UID: \"9360b100-6180-43f7-95f5-1d487e9a919a\") " pod="keycloak-system/keycloak-operator-5c4df598dd-mqqqx" Apr 21 02:51:20.325757 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:51:20.325666 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rn2t5\" (UniqueName: \"kubernetes.io/projected/9360b100-6180-43f7-95f5-1d487e9a919a-kube-api-access-rn2t5\") pod \"keycloak-operator-5c4df598dd-mqqqx\" (UID: \"9360b100-6180-43f7-95f5-1d487e9a919a\") " pod="keycloak-system/keycloak-operator-5c4df598dd-mqqqx" Apr 21 02:51:20.333633 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:51:20.333594 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn2t5\" (UniqueName: 
\"kubernetes.io/projected/9360b100-6180-43f7-95f5-1d487e9a919a-kube-api-access-rn2t5\") pod \"keycloak-operator-5c4df598dd-mqqqx\" (UID: \"9360b100-6180-43f7-95f5-1d487e9a919a\") " pod="keycloak-system/keycloak-operator-5c4df598dd-mqqqx" Apr 21 02:51:20.382106 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:51:20.382070 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/keycloak-operator-5c4df598dd-mqqqx" Apr 21 02:51:20.505713 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:51:20.505691 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-mqqqx"] Apr 21 02:51:20.508441 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:51:20.508406 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9360b100_6180_43f7_95f5_1d487e9a919a.slice/crio-971faf9c6d11ad01c193ed295db78b4a0be187defeb0e2fe77b807a3f3ac5445 WatchSource:0}: Error finding container 971faf9c6d11ad01c193ed295db78b4a0be187defeb0e2fe77b807a3f3ac5445: Status 404 returned error can't find the container with id 971faf9c6d11ad01c193ed295db78b4a0be187defeb0e2fe77b807a3f3ac5445 Apr 21 02:51:20.835024 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:51:20.834992 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/keycloak-operator-5c4df598dd-mqqqx" event={"ID":"9360b100-6180-43f7-95f5-1d487e9a919a","Type":"ContainerStarted","Data":"971faf9c6d11ad01c193ed295db78b4a0be187defeb0e2fe77b807a3f3ac5445"} Apr 21 02:51:26.857926 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:51:26.857841 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/keycloak-operator-5c4df598dd-mqqqx" event={"ID":"9360b100-6180-43f7-95f5-1d487e9a919a","Type":"ContainerStarted","Data":"2b66b9540534d2e6feb66484ae9889dee2afdc36ec7cb8c6f14a67c0089ad9a8"} Apr 21 02:51:26.872504 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:51:26.872453 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/keycloak-operator-5c4df598dd-mqqqx" podStartSLOduration=0.810509884 podStartE2EDuration="6.872437794s" podCreationTimestamp="2026-04-21 02:51:20 +0000 UTC" firstStartedPulling="2026-04-21 02:51:20.50988113 +0000 UTC m=+648.194425371" lastFinishedPulling="2026-04-21 02:51:26.571809031 +0000 UTC m=+654.256353281" observedRunningTime="2026-04-21 02:51:26.870759625 +0000 UTC m=+654.555303887" watchObservedRunningTime="2026-04-21 02:51:26.872437794 +0000 UTC m=+654.556982053" Apr 21 02:52:16.511385 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:52:16.511352 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2bmfc"] Apr 21 02:53:30.303864 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:30.303828 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2bmfc"] Apr 21 02:53:35.776251 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:35.776213 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2bmfc"] Apr 21 02:53:36.474368 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:36.474338 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9"] Apr 21 02:53:36.477566 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:36.477549 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9" Apr 21 02:53:36.480146 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:36.480124 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 21 02:53:36.480274 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:36.480159 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 21 02:53:36.480340 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:36.480318 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-275zm\"" Apr 21 02:53:36.481086 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:36.481069 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 21 02:53:36.487976 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:36.487955 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9"] Apr 21 02:53:36.648823 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:36.648795 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e337440f-acdb-426c-9a66-317ec3843ff5-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9\" (UID: \"e337440f-acdb-426c-9a66-317ec3843ff5\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9" Apr 21 02:53:36.649012 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:36.648830 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e337440f-acdb-426c-9a66-317ec3843ff5-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9\" (UID: \"e337440f-acdb-426c-9a66-317ec3843ff5\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9" Apr 21 02:53:36.649012 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:36.648854 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mws5s\" (UniqueName: \"kubernetes.io/projected/e337440f-acdb-426c-9a66-317ec3843ff5-kube-api-access-mws5s\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9\" (UID: \"e337440f-acdb-426c-9a66-317ec3843ff5\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9" Apr 21 02:53:36.649012 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:36.648935 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e337440f-acdb-426c-9a66-317ec3843ff5-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9\" (UID: \"e337440f-acdb-426c-9a66-317ec3843ff5\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9" Apr 21 02:53:36.649012 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:36.648993 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e337440f-acdb-426c-9a66-317ec3843ff5-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9\" (UID: \"e337440f-acdb-426c-9a66-317ec3843ff5\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9" Apr 21 02:53:36.649171 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:36.649034 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e337440f-acdb-426c-9a66-317ec3843ff5-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9\" (UID: \"e337440f-acdb-426c-9a66-317ec3843ff5\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9" Apr 21 02:53:36.750376 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:36.750291 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e337440f-acdb-426c-9a66-317ec3843ff5-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9\" (UID: \"e337440f-acdb-426c-9a66-317ec3843ff5\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9" Apr 21 02:53:36.750376 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:36.750331 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e337440f-acdb-426c-9a66-317ec3843ff5-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9\" (UID: \"e337440f-acdb-426c-9a66-317ec3843ff5\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9" Apr 21 02:53:36.750376 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:36.750352 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mws5s\" (UniqueName: \"kubernetes.io/projected/e337440f-acdb-426c-9a66-317ec3843ff5-kube-api-access-mws5s\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9\" (UID: \"e337440f-acdb-426c-9a66-317ec3843ff5\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9" Apr 21 02:53:36.750376 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:36.750376 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e337440f-acdb-426c-9a66-317ec3843ff5-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9\" (UID: \"e337440f-acdb-426c-9a66-317ec3843ff5\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9" Apr 21 02:53:36.750721 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:36.750399 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e337440f-acdb-426c-9a66-317ec3843ff5-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9\" (UID: \"e337440f-acdb-426c-9a66-317ec3843ff5\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9" Apr 21 02:53:36.750721 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:36.750425 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e337440f-acdb-426c-9a66-317ec3843ff5-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9\" (UID: \"e337440f-acdb-426c-9a66-317ec3843ff5\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9" Apr 21 02:53:36.750829 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:36.750766 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e337440f-acdb-426c-9a66-317ec3843ff5-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9\" (UID: \"e337440f-acdb-426c-9a66-317ec3843ff5\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9" Apr 21 02:53:36.750829 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:36.750780 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e337440f-acdb-426c-9a66-317ec3843ff5-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9\" (UID: \"e337440f-acdb-426c-9a66-317ec3843ff5\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9" Apr 21 02:53:36.750918 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:36.750834 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e337440f-acdb-426c-9a66-317ec3843ff5-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9\" (UID: \"e337440f-acdb-426c-9a66-317ec3843ff5\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9" Apr 21 02:53:36.752762 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:36.752725 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e337440f-acdb-426c-9a66-317ec3843ff5-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9\" (UID: \"e337440f-acdb-426c-9a66-317ec3843ff5\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9" Apr 21 02:53:36.753033 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:36.753014 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e337440f-acdb-426c-9a66-317ec3843ff5-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9\" (UID: \"e337440f-acdb-426c-9a66-317ec3843ff5\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9" Apr 21 02:53:36.757400 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:36.757376 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mws5s\" (UniqueName: \"kubernetes.io/projected/e337440f-acdb-426c-9a66-317ec3843ff5-kube-api-access-mws5s\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9\" (UID: \"e337440f-acdb-426c-9a66-317ec3843ff5\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9" Apr 21 02:53:36.789884 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:36.789858 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9" Apr 21 02:53:36.911353 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:36.911327 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9"] Apr 21 02:53:36.914158 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:53:36.914131 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode337440f_acdb_426c_9a66_317ec3843ff5.slice/crio-a9e491b6f0473777f5ca4114aade45f2e8fd9c857d728974edc2453cf0ebd624 WatchSource:0}: Error finding container a9e491b6f0473777f5ca4114aade45f2e8fd9c857d728974edc2453cf0ebd624: Status 404 returned error can't find the container with id a9e491b6f0473777f5ca4114aade45f2e8fd9c857d728974edc2453cf0ebd624 Apr 21 02:53:36.916031 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:36.916012 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 02:53:37.288337 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:37.288300 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9" event={"ID":"e337440f-acdb-426c-9a66-317ec3843ff5","Type":"ContainerStarted","Data":"a9e491b6f0473777f5ca4114aade45f2e8fd9c857d728974edc2453cf0ebd624"} Apr 21 02:53:38.079298 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:38.079257 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2bmfc"] Apr 21 02:53:42.310853 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:42.310816 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9" event={"ID":"e337440f-acdb-426c-9a66-317ec3843ff5","Type":"ContainerStarted","Data":"964f637850a91eff5493e33d56b91d1d799db1746ec35ec61c471af30c22fa31"} Apr 21 02:53:48.333223 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:48.331883 2572 generic.go:358] "Generic (PLEG): container finished" podID="e337440f-acdb-426c-9a66-317ec3843ff5" containerID="964f637850a91eff5493e33d56b91d1d799db1746ec35ec61c471af30c22fa31" exitCode=0 Apr 21 02:53:48.333223 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:48.331942 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9" event={"ID":"e337440f-acdb-426c-9a66-317ec3843ff5","Type":"ContainerDied","Data":"964f637850a91eff5493e33d56b91d1d799db1746ec35ec61c471af30c22fa31"} Apr 21 02:53:50.344962 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:50.344924 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9" event={"ID":"e337440f-acdb-426c-9a66-317ec3843ff5","Type":"ContainerStarted","Data":"f5095dfbbda024351b757701e70b94041e09f7d461bfa1626be479a9179e0af2"} Apr 21 02:53:50.345347 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:50.345130 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9" Apr 21 02:53:50.361400 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:53:50.361345 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9" podStartSLOduration=1.963133102 podStartE2EDuration="14.361328222s" podCreationTimestamp="2026-04-21 02:53:36 +0000 UTC" firstStartedPulling="2026-04-21 02:53:36.916138393 +0000 UTC m=+784.600682635" 
lastFinishedPulling="2026-04-21 02:53:49.314333518 +0000 UTC m=+796.998877755" observedRunningTime="2026-04-21 02:53:50.360308616 +0000 UTC m=+798.044852876" watchObservedRunningTime="2026-04-21 02:53:50.361328222 +0000 UTC m=+798.045872484" Apr 21 02:54:01.366269 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:01.366177 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9" Apr 21 02:54:17.880785 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:17.880744 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk"] Apr 21 02:54:17.909770 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:17.909728 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk"] Apr 21 02:54:17.909914 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:17.909871 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk" Apr 21 02:54:17.912271 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:17.912245 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 21 02:54:17.979852 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:17.979815 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2g6g\" (UniqueName: \"kubernetes.io/projected/1f594c02-5b77-4cb8-8cf9-b408800895d0-kube-api-access-b2g6g\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk\" (UID: \"1f594c02-5b77-4cb8-8cf9-b408800895d0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk" Apr 21 02:54:17.980045 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:17.979869 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1f594c02-5b77-4cb8-8cf9-b408800895d0-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk\" (UID: \"1f594c02-5b77-4cb8-8cf9-b408800895d0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk" Apr 21 02:54:17.980045 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:17.979952 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1f594c02-5b77-4cb8-8cf9-b408800895d0-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk\" (UID: \"1f594c02-5b77-4cb8-8cf9-b408800895d0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk" Apr 21 02:54:17.980045 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:17.979991 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1f594c02-5b77-4cb8-8cf9-b408800895d0-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk\" (UID: \"1f594c02-5b77-4cb8-8cf9-b408800895d0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk" Apr 21 02:54:17.980205 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:17.980060 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/1f594c02-5b77-4cb8-8cf9-b408800895d0-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk\" (UID: \"1f594c02-5b77-4cb8-8cf9-b408800895d0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk" Apr 21 02:54:17.980205 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:17.980099 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f594c02-5b77-4cb8-8cf9-b408800895d0-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk\" (UID: \"1f594c02-5b77-4cb8-8cf9-b408800895d0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk" Apr 21 02:54:18.081035 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:18.081002 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2g6g\" (UniqueName: \"kubernetes.io/projected/1f594c02-5b77-4cb8-8cf9-b408800895d0-kube-api-access-b2g6g\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk\" (UID: \"1f594c02-5b77-4cb8-8cf9-b408800895d0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk" Apr 21 02:54:18.081226 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:18.081057 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1f594c02-5b77-4cb8-8cf9-b408800895d0-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk\" (UID: \"1f594c02-5b77-4cb8-8cf9-b408800895d0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk" Apr 21 02:54:18.081226 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:18.081081 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1f594c02-5b77-4cb8-8cf9-b408800895d0-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk\" (UID: \"1f594c02-5b77-4cb8-8cf9-b408800895d0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk" Apr 21 02:54:18.081226 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:18.081108 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1f594c02-5b77-4cb8-8cf9-b408800895d0-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk\" (UID: \"1f594c02-5b77-4cb8-8cf9-b408800895d0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk" Apr 21 02:54:18.081226 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:18.081138 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1f594c02-5b77-4cb8-8cf9-b408800895d0-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk\" (UID: \"1f594c02-5b77-4cb8-8cf9-b408800895d0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk" Apr 21 02:54:18.081226 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:18.081166 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f594c02-5b77-4cb8-8cf9-b408800895d0-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk\" (UID: \"1f594c02-5b77-4cb8-8cf9-b408800895d0\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk" Apr 21 02:54:18.081632 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:18.081607 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1f594c02-5b77-4cb8-8cf9-b408800895d0-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk\" (UID: \"1f594c02-5b77-4cb8-8cf9-b408800895d0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk" Apr 21 02:54:18.081699 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:18.081631 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1f594c02-5b77-4cb8-8cf9-b408800895d0-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk\" (UID: \"1f594c02-5b77-4cb8-8cf9-b408800895d0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk" Apr 21 02:54:18.081699 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:18.081666 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f594c02-5b77-4cb8-8cf9-b408800895d0-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk\" (UID: \"1f594c02-5b77-4cb8-8cf9-b408800895d0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk" Apr 21 02:54:18.083267 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:18.083237 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1f594c02-5b77-4cb8-8cf9-b408800895d0-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk\" (UID: \"1f594c02-5b77-4cb8-8cf9-b408800895d0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk" Apr 21 02:54:18.083384 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:18.083332 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1f594c02-5b77-4cb8-8cf9-b408800895d0-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk\" (UID: \"1f594c02-5b77-4cb8-8cf9-b408800895d0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk" Apr 21 02:54:18.088180 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:18.088157 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2g6g\" (UniqueName: \"kubernetes.io/projected/1f594c02-5b77-4cb8-8cf9-b408800895d0-kube-api-access-b2g6g\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk\" (UID: \"1f594c02-5b77-4cb8-8cf9-b408800895d0\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk" Apr 21 02:54:18.219774 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:18.219732 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk" Apr 21 02:54:18.346947 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:18.346918 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk"] Apr 21 02:54:18.349787 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:54:18.349751 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f594c02_5b77_4cb8_8cf9_b408800895d0.slice/crio-8e0ee04dd7defa2b747659a4a603d8ab002f9821a3ed47a77dc9a85acf94f35b WatchSource:0}: Error finding container 8e0ee04dd7defa2b747659a4a603d8ab002f9821a3ed47a77dc9a85acf94f35b: Status 404 returned error can't find the container with id 8e0ee04dd7defa2b747659a4a603d8ab002f9821a3ed47a77dc9a85acf94f35b Apr 21 02:54:18.439194 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:18.439155 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk" event={"ID":"1f594c02-5b77-4cb8-8cf9-b408800895d0","Type":"ContainerStarted","Data":"61109a7fc2f5cf9a65563b2cdf614c91f6be27b8e22e1f9874621f356c21450a"} Apr 21 02:54:18.439194 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:18.439197 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk" event={"ID":"1f594c02-5b77-4cb8-8cf9-b408800895d0","Type":"ContainerStarted","Data":"8e0ee04dd7defa2b747659a4a603d8ab002f9821a3ed47a77dc9a85acf94f35b"} Apr 21 02:54:19.169613 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:19.169570 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h"] Apr 21 02:54:19.172849 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:19.172824 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h" Apr 21 02:54:19.175364 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:19.175338 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 21 02:54:19.183719 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:19.183694 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h"] Apr 21 02:54:19.186730 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:19.186708 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2bmfc"] Apr 21 02:54:19.291588 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:19.291550 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dcf3cbc8-897b-4ba5-a20e-1bac17f6c472-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-9bd4h\" (UID: \"dcf3cbc8-897b-4ba5-a20e-1bac17f6c472\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h" Apr 21 02:54:19.291588 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:19.291590 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dcf3cbc8-897b-4ba5-a20e-1bac17f6c472-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-9bd4h\" (UID: \"dcf3cbc8-897b-4ba5-a20e-1bac17f6c472\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h" Apr 21 02:54:19.291844 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:19.291629 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dcf3cbc8-897b-4ba5-a20e-1bac17f6c472-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-9bd4h\" (UID: \"dcf3cbc8-897b-4ba5-a20e-1bac17f6c472\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h" Apr 21 02:54:19.291844 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:19.291766 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jq6r\" (UniqueName: \"kubernetes.io/projected/dcf3cbc8-897b-4ba5-a20e-1bac17f6c472-kube-api-access-8jq6r\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-9bd4h\" (UID: \"dcf3cbc8-897b-4ba5-a20e-1bac17f6c472\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h" Apr 21 02:54:19.291844 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:19.291797 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dcf3cbc8-897b-4ba5-a20e-1bac17f6c472-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-9bd4h\" (UID: \"dcf3cbc8-897b-4ba5-a20e-1bac17f6c472\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h" Apr 21 02:54:19.291994 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:19.291845 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dcf3cbc8-897b-4ba5-a20e-1bac17f6c472-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-9bd4h\" (UID: \"dcf3cbc8-897b-4ba5-a20e-1bac17f6c472\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h" Apr 21 02:54:19.393307 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:19.393258 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8jq6r\" (UniqueName: \"kubernetes.io/projected/dcf3cbc8-897b-4ba5-a20e-1bac17f6c472-kube-api-access-8jq6r\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-9bd4h\" (UID: \"dcf3cbc8-897b-4ba5-a20e-1bac17f6c472\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h" Apr 21 02:54:19.393456 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:19.393319 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dcf3cbc8-897b-4ba5-a20e-1bac17f6c472-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-9bd4h\" (UID: \"dcf3cbc8-897b-4ba5-a20e-1bac17f6c472\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h" Apr 21 02:54:19.393456 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:19.393363 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dcf3cbc8-897b-4ba5-a20e-1bac17f6c472-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-9bd4h\" (UID: \"dcf3cbc8-897b-4ba5-a20e-1bac17f6c472\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h" Apr 21 02:54:19.393456 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:19.393434 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dcf3cbc8-897b-4ba5-a20e-1bac17f6c472-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-9bd4h\" (UID: \"dcf3cbc8-897b-4ba5-a20e-1bac17f6c472\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h" Apr 21 02:54:19.393649 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:19.393465 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dcf3cbc8-897b-4ba5-a20e-1bac17f6c472-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-9bd4h\" (UID: \"dcf3cbc8-897b-4ba5-a20e-1bac17f6c472\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h" Apr 21 02:54:19.393649 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:19.393600 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dcf3cbc8-897b-4ba5-a20e-1bac17f6c472-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-9bd4h\" (UID: \"dcf3cbc8-897b-4ba5-a20e-1bac17f6c472\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h" Apr 21 02:54:19.393854 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:19.393832 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dcf3cbc8-897b-4ba5-a20e-1bac17f6c472-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-9bd4h\" (UID: \"dcf3cbc8-897b-4ba5-a20e-1bac17f6c472\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h" Apr 21 02:54:19.393916 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:19.393880 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dcf3cbc8-897b-4ba5-a20e-1bac17f6c472-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-9bd4h\" (UID: \"dcf3cbc8-897b-4ba5-a20e-1bac17f6c472\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h" Apr 21 02:54:19.393916 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:19.393904 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" 
(UniqueName: \"kubernetes.io/empty-dir/dcf3cbc8-897b-4ba5-a20e-1bac17f6c472-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-9bd4h\" (UID: \"dcf3cbc8-897b-4ba5-a20e-1bac17f6c472\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h" Apr 21 02:54:19.395567 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:19.395537 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dcf3cbc8-897b-4ba5-a20e-1bac17f6c472-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-9bd4h\" (UID: \"dcf3cbc8-897b-4ba5-a20e-1bac17f6c472\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h" Apr 21 02:54:19.395939 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:19.395917 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dcf3cbc8-897b-4ba5-a20e-1bac17f6c472-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-9bd4h\" (UID: \"dcf3cbc8-897b-4ba5-a20e-1bac17f6c472\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h" Apr 21 02:54:19.400281 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:19.400254 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jq6r\" (UniqueName: \"kubernetes.io/projected/dcf3cbc8-897b-4ba5-a20e-1bac17f6c472-kube-api-access-8jq6r\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-9bd4h\" (UID: \"dcf3cbc8-897b-4ba5-a20e-1bac17f6c472\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h" Apr 21 02:54:19.484300 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:19.484258 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h" Apr 21 02:54:19.644651 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:19.644602 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h"] Apr 21 02:54:20.448175 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:20.448133 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h" event={"ID":"dcf3cbc8-897b-4ba5-a20e-1bac17f6c472","Type":"ContainerStarted","Data":"3f4e6c4048b0364b4ffe2f4d3487c4030c455829e3434da31fc1bed6ef7da8d5"} Apr 21 02:54:20.448565 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:20.448182 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h" event={"ID":"dcf3cbc8-897b-4ba5-a20e-1bac17f6c472","Type":"ContainerStarted","Data":"ac5daa9694c36cf625bafe94615ffad1045518e8d46c1944747ff306782382f1"} Apr 21 02:54:22.479810 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:22.479772 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2bmfc"] Apr 21 02:54:24.463408 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:24.463372 2572 generic.go:358] "Generic (PLEG): container finished" podID="1f594c02-5b77-4cb8-8cf9-b408800895d0" containerID="61109a7fc2f5cf9a65563b2cdf614c91f6be27b8e22e1f9874621f356c21450a" exitCode=0 Apr 21 02:54:24.463880 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:24.463446 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk" event={"ID":"1f594c02-5b77-4cb8-8cf9-b408800895d0","Type":"ContainerDied","Data":"61109a7fc2f5cf9a65563b2cdf614c91f6be27b8e22e1f9874621f356c21450a"} Apr 21 02:54:25.468505 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:25.468463 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk" event={"ID":"1f594c02-5b77-4cb8-8cf9-b408800895d0","Type":"ContainerStarted","Data":"074b795acdfcf0bd3985c448004fa0af23a8cfad9d007e3a608adfe464ae6061"} Apr 21 02:54:25.469173 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:25.468706 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk" Apr 21 02:54:25.469896 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:25.469872 2572 generic.go:358] "Generic (PLEG): container finished" podID="dcf3cbc8-897b-4ba5-a20e-1bac17f6c472" containerID="3f4e6c4048b0364b4ffe2f4d3487c4030c455829e3434da31fc1bed6ef7da8d5" exitCode=0 Apr 21 02:54:25.470004 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:25.469945 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h" event={"ID":"dcf3cbc8-897b-4ba5-a20e-1bac17f6c472","Type":"ContainerDied","Data":"3f4e6c4048b0364b4ffe2f4d3487c4030c455829e3434da31fc1bed6ef7da8d5"} Apr 21 02:54:25.486025 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:25.485978 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk" podStartSLOduration=8.293016183 podStartE2EDuration="8.485964053s" podCreationTimestamp="2026-04-21 02:54:17 +0000 UTC" firstStartedPulling="2026-04-21 02:54:24.464184506 +0000 UTC m=+832.148728745" lastFinishedPulling="2026-04-21 02:54:24.657132374 +0000 UTC m=+832.341676615" observedRunningTime="2026-04-21 02:54:25.484077468 +0000 UTC m=+833.168621729" watchObservedRunningTime="2026-04-21 02:54:25.485964053 +0000 UTC m=+833.170508312" Apr 21 02:54:26.475058 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:26.475021 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h" event={"ID":"dcf3cbc8-897b-4ba5-a20e-1bac17f6c472","Type":"ContainerStarted","Data":"3edc7e6908a7e67e0e18f63c873babb0cf70eed53ca3c0c7c1bb735fee072fff"} Apr 21 02:54:26.475442 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:26.475373 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h" Apr 21 02:54:26.493242 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:26.493188 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h" podStartSLOduration=7.301175021 podStartE2EDuration="7.493173352s" podCreationTimestamp="2026-04-21 02:54:19 +0000 UTC" firstStartedPulling="2026-04-21 02:54:25.470469675 +0000 UTC m=+833.155013916" lastFinishedPulling="2026-04-21 02:54:25.662468009 +0000 UTC m=+833.347012247" observedRunningTime="2026-04-21 02:54:26.491041037 +0000 UTC m=+834.175585296" watchObservedRunningTime="2026-04-21 02:54:26.493173352 +0000 UTC m=+834.177717665" Apr 21 02:54:36.488356 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:36.488322 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk" Apr 21 02:54:37.491643 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:37.491613 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-9bd4h" Apr 21 02:54:38.882726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:54:38.882690 2572 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2bmfc"] Apr 21 02:55:43.605591 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:55:43.605541 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-2cqdx_6151b4cb-ae22-4b41-ab45-86e3ae63e7f4/manager/0.log" Apr 21 02:55:43.968438 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:55:43.968390 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-gmv48_845205e9-4ed6-439b-9e1e-f2f66fecc8f3/manager/2.log" Apr 21 02:55:44.188927 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:55:44.188881 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5f4d6bff-tcbdh_63f53f70-1b73-4784-89ca-6b4c169a1521/manager/0.log" Apr 21 02:55:46.013955 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:55:46.013923 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-96b6f_d1e291af-d9ee-49ef-895b-be5a6e705eaf/manager/0.log" Apr 21 02:55:46.474778 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:55:46.474748 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-2bmfc_ee158c03-fdd4-4786-add4-1a851173ca55/limitador/0.log" Apr 21 02:55:47.055726 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:55:47.055691 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-cclxf_861f6e92-414d-4fcb-ad9f-07d80cc5e5d2/discovery/0.log" Apr 21 02:55:47.263055 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:55:47.263023 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-55fc66fcf7-wdh67_838a4b2c-71ad-43cc-ad01-797dad3a3ecf/kube-auth-proxy/0.log" Apr 21 02:55:47.367496 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:55:47.367398 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-m5n8z_0918a43b-f702-44ce-ad65-0d11bcadfd54/istio-proxy/0.log" Apr 21 02:55:48.040146 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:55:48.040112 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-9bd4h_dcf3cbc8-897b-4ba5-a20e-1bac17f6c472/storage-initializer/0.log" Apr 21 02:55:48.047946 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:55:48.047918 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-9bd4h_dcf3cbc8-897b-4ba5-a20e-1bac17f6c472/main/0.log" Apr 21 02:55:48.158136 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:55:48.158104 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk_1f594c02-5b77-4cb8-8cf9-b408800895d0/storage-initializer/0.log" Apr 21 02:55:48.165331 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:55:48.165308 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccq5kxk_1f594c02-5b77-4cb8-8cf9-b408800895d0/main/0.log" Apr 21 02:55:48.273637 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:55:48.273611 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9_e337440f-acdb-426c-9a66-317ec3843ff5/main/0.log" Apr 21 02:55:48.279960 ip-10-0-137-147 kubenswrapper[2572]: I0421 
02:55:48.279941 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-g4cj9_e337440f-acdb-426c-9a66-317ec3843ff5/storage-initializer/0.log" Apr 21 02:55:54.959573 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:55:54.959539 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-xlxwj_7cc43e1f-6f61-404a-ad72-d62ed23cea64/global-pull-secret-syncer/0.log" Apr 21 02:55:55.029552 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:55:55.029509 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-69dnr_60d6f338-9195-4f2e-ab8b-d1a92cd1fc22/konnectivity-agent/0.log" Apr 21 02:55:55.125016 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:55:55.124987 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-147.ec2.internal_be812a68c721006fc84b840bb8d76277/haproxy/0.log" Apr 21 02:55:59.280368 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:55:59.280340 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-96b6f_d1e291af-d9ee-49ef-895b-be5a6e705eaf/manager/0.log" Apr 21 02:55:59.415247 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:55:59.415211 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-2bmfc_ee158c03-fdd4-4786-add4-1a851173ca55/limitador/0.log" Apr 21 02:56:01.425078 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:01.425040 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bl4vc_1d063354-c5a3-4ad2-a124-25953ad1623e/node-exporter/0.log" Apr 21 02:56:01.460344 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:01.460322 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bl4vc_1d063354-c5a3-4ad2-a124-25953ad1623e/kube-rbac-proxy/0.log" Apr 21 02:56:01.482652 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:01.482632 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bl4vc_1d063354-c5a3-4ad2-a124-25953ad1623e/init-textfile/0.log" Apr 21 02:56:03.575462 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:03.575430 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tmjwp/perf-node-gather-daemonset-62qxr"] Apr 21 02:56:03.578759 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:03.578738 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-62qxr" Apr 21 02:56:03.580952 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:03.580936 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tmjwp\"/\"openshift-service-ca.crt\"" Apr 21 02:56:03.581625 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:03.581599 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tmjwp\"/\"kube-root-ca.crt\"" Apr 21 02:56:03.581718 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:03.581626 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-tmjwp\"/\"default-dockercfg-rx8jc\"" Apr 21 02:56:03.585291 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:03.585268 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tmjwp/perf-node-gather-daemonset-62qxr"] Apr 21 02:56:03.690328 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:03.690297 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1bd6d37e-f307-4431-8a44-c7d83798da11-lib-modules\") pod \"perf-node-gather-daemonset-62qxr\" (UID: \"1bd6d37e-f307-4431-8a44-c7d83798da11\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-62qxr" Apr 21 02:56:03.690328 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:03.690336 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1bd6d37e-f307-4431-8a44-c7d83798da11-podres\") pod \"perf-node-gather-daemonset-62qxr\" (UID: \"1bd6d37e-f307-4431-8a44-c7d83798da11\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-62qxr" Apr 21 02:56:03.690543 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:03.690371 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57tlm\" (UniqueName: \"kubernetes.io/projected/1bd6d37e-f307-4431-8a44-c7d83798da11-kube-api-access-57tlm\") pod \"perf-node-gather-daemonset-62qxr\" (UID: \"1bd6d37e-f307-4431-8a44-c7d83798da11\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-62qxr" Apr 21 02:56:03.690543 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:03.690388 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1bd6d37e-f307-4431-8a44-c7d83798da11-sys\") pod \"perf-node-gather-daemonset-62qxr\" (UID: \"1bd6d37e-f307-4431-8a44-c7d83798da11\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-62qxr" Apr 21 02:56:03.690543 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:03.690401 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1bd6d37e-f307-4431-8a44-c7d83798da11-proc\") pod \"perf-node-gather-daemonset-62qxr\" (UID: \"1bd6d37e-f307-4431-8a44-c7d83798da11\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-62qxr" Apr 21 02:56:03.790796 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:03.790767 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57tlm\" (UniqueName: \"kubernetes.io/projected/1bd6d37e-f307-4431-8a44-c7d83798da11-kube-api-access-57tlm\") pod \"perf-node-gather-daemonset-62qxr\" (UID: \"1bd6d37e-f307-4431-8a44-c7d83798da11\") 
" pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-62qxr" Apr 21 02:56:03.790908 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:03.790798 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1bd6d37e-f307-4431-8a44-c7d83798da11-sys\") pod \"perf-node-gather-daemonset-62qxr\" (UID: \"1bd6d37e-f307-4431-8a44-c7d83798da11\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-62qxr" Apr 21 02:56:03.790908 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:03.790817 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1bd6d37e-f307-4431-8a44-c7d83798da11-proc\") pod \"perf-node-gather-daemonset-62qxr\" (UID: \"1bd6d37e-f307-4431-8a44-c7d83798da11\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-62qxr" Apr 21 02:56:03.790908 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:03.790862 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1bd6d37e-f307-4431-8a44-c7d83798da11-lib-modules\") pod \"perf-node-gather-daemonset-62qxr\" (UID: \"1bd6d37e-f307-4431-8a44-c7d83798da11\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-62qxr" Apr 21 02:56:03.790908 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:03.790885 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1bd6d37e-f307-4431-8a44-c7d83798da11-podres\") pod \"perf-node-gather-daemonset-62qxr\" (UID: \"1bd6d37e-f307-4431-8a44-c7d83798da11\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-62qxr" Apr 21 02:56:03.790908 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:03.790894 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1bd6d37e-f307-4431-8a44-c7d83798da11-sys\") pod \"perf-node-gather-daemonset-62qxr\" (UID: \"1bd6d37e-f307-4431-8a44-c7d83798da11\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-62qxr" Apr 21 02:56:03.791120 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:03.790969 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1bd6d37e-f307-4431-8a44-c7d83798da11-proc\") pod \"perf-node-gather-daemonset-62qxr\" (UID: \"1bd6d37e-f307-4431-8a44-c7d83798da11\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-62qxr" Apr 21 02:56:03.791120 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:03.790985 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1bd6d37e-f307-4431-8a44-c7d83798da11-podres\") pod \"perf-node-gather-daemonset-62qxr\" (UID: \"1bd6d37e-f307-4431-8a44-c7d83798da11\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-62qxr" Apr 21 02:56:03.791120 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:03.790992 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1bd6d37e-f307-4431-8a44-c7d83798da11-lib-modules\") pod \"perf-node-gather-daemonset-62qxr\" (UID: \"1bd6d37e-f307-4431-8a44-c7d83798da11\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-62qxr" Apr 21 02:56:03.798196 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:03.798165 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-57tlm\" (UniqueName: \"kubernetes.io/projected/1bd6d37e-f307-4431-8a44-c7d83798da11-kube-api-access-57tlm\") pod \"perf-node-gather-daemonset-62qxr\" (UID: \"1bd6d37e-f307-4431-8a44-c7d83798da11\") " pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-62qxr" Apr 21 02:56:03.889709 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:03.889637 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-62qxr" Apr 21 02:56:04.214563 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:04.214506 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tmjwp/perf-node-gather-daemonset-62qxr"] Apr 21 02:56:04.216893 ip-10-0-137-147 kubenswrapper[2572]: W0421 02:56:04.216865 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1bd6d37e_f307_4431_8a44_c7d83798da11.slice/crio-285f31c89344bfaae94465d38d8a51a848392e86beb8bb4ad82bdc4fb80dcbd3 WatchSource:0}: Error finding container 285f31c89344bfaae94465d38d8a51a848392e86beb8bb4ad82bdc4fb80dcbd3: Status 404 returned error can't find the container with id 285f31c89344bfaae94465d38d8a51a848392e86beb8bb4ad82bdc4fb80dcbd3 Apr 21 02:56:04.799734 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:04.799697 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-62qxr" event={"ID":"1bd6d37e-f307-4431-8a44-c7d83798da11","Type":"ContainerStarted","Data":"e1fe413a800abe06d66b6a5802064d5ec1f27830273e694a58b5340be698048e"} Apr 21 02:56:04.799734 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:04.799737 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-62qxr" event={"ID":"1bd6d37e-f307-4431-8a44-c7d83798da11","Type":"ContainerStarted","Data":"285f31c89344bfaae94465d38d8a51a848392e86beb8bb4ad82bdc4fb80dcbd3"} Apr 21 02:56:04.800291 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:04.799808 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-62qxr" Apr 21 02:56:04.814361 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:04.814315 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-62qxr" podStartSLOduration=1.814300737 podStartE2EDuration="1.814300737s" podCreationTimestamp="2026-04-21 02:56:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 02:56:04.813634726 +0000 UTC m=+932.498178986" watchObservedRunningTime="2026-04-21 02:56:04.814300737 +0000 UTC m=+932.498844996" Apr 21 02:56:05.573632 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:05.573596 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4c7td_e8ab2f22-c931-49de-80fd-45193fa7eda9/dns/0.log" Apr 21 02:56:05.594081 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:05.594056 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4c7td_e8ab2f22-c931-49de-80fd-45193fa7eda9/kube-rbac-proxy/0.log" Apr 21 02:56:05.703308 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:05.703281 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9mbn4_a9bda1dd-f3d4-41e7-9167-d144e08a951c/dns-node-resolver/0.log" Apr 21 02:56:06.254955 ip-10-0-137-147 kubenswrapper[2572]: 
I0421 02:56:06.254927 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wwdwn_d62dffc6-07e2-43c5-929f-e5547bc6cbb9/node-ca/0.log" Apr 21 02:56:07.153655 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:07.153627 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-cclxf_861f6e92-414d-4fcb-ad9f-07d80cc5e5d2/discovery/0.log" Apr 21 02:56:07.199130 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:07.199103 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-55fc66fcf7-wdh67_838a4b2c-71ad-43cc-ad01-797dad3a3ecf/kube-auth-proxy/0.log" Apr 21 02:56:07.225630 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:07.225604 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-m5n8z_0918a43b-f702-44ce-ad65-0d11bcadfd54/istio-proxy/0.log" Apr 21 02:56:07.777933 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:07.777907 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-zn5hz_4bdbc6e8-8ad6-4643-b8ca-75ba5c93e85a/serve-healthcheck-canary/0.log" Apr 21 02:56:08.358250 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:08.358222 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9khlr_04ae7926-9039-48f6-910e-3c6deeb48e8a/kube-rbac-proxy/0.log" Apr 21 02:56:08.383297 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:08.383274 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9khlr_04ae7926-9039-48f6-910e-3c6deeb48e8a/exporter/0.log" Apr 21 02:56:08.403216 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:08.403187 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9khlr_04ae7926-9039-48f6-910e-3c6deeb48e8a/extractor/0.log" Apr 21 02:56:10.295250 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:10.295222 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-2cqdx_6151b4cb-ae22-4b41-ab45-86e3ae63e7f4/manager/0.log" Apr 21 02:56:10.362778 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:10.362748 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-gmv48_845205e9-4ed6-439b-9e1e-f2f66fecc8f3/manager/1.log" Apr 21 02:56:10.371783 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:10.371762 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-gmv48_845205e9-4ed6-439b-9e1e-f2f66fecc8f3/manager/2.log" Apr 21 02:56:10.435768 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:10.435735 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5f4d6bff-tcbdh_63f53f70-1b73-4784-89ca-6b4c169a1521/manager/0.log" Apr 21 02:56:10.811967 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:10.811940 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-tmjwp/perf-node-gather-daemonset-62qxr" Apr 21 02:56:17.598136 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:17.598098 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-48qgw_4c53251f-0bae-438f-82b6-956e50adc4eb/kube-multus/0.log" Apr 21 02:56:17.935976 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:17.935895 2572 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kqfw8_773d483d-dfc3-4e6e-b1fa-f8da910c09d0/kube-multus-additional-cni-plugins/0.log" Apr 21 02:56:17.955119 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:17.955092 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kqfw8_773d483d-dfc3-4e6e-b1fa-f8da910c09d0/egress-router-binary-copy/0.log" Apr 21 02:56:17.975782 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:17.975758 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kqfw8_773d483d-dfc3-4e6e-b1fa-f8da910c09d0/cni-plugins/0.log" Apr 21 02:56:17.999090 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:17.999061 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kqfw8_773d483d-dfc3-4e6e-b1fa-f8da910c09d0/bond-cni-plugin/0.log" Apr 21 02:56:18.018064 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:18.018039 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kqfw8_773d483d-dfc3-4e6e-b1fa-f8da910c09d0/routeoverride-cni/0.log" Apr 21 02:56:18.050221 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:18.050197 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kqfw8_773d483d-dfc3-4e6e-b1fa-f8da910c09d0/whereabouts-cni-bincopy/0.log" Apr 21 02:56:18.069940 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:18.069918 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kqfw8_773d483d-dfc3-4e6e-b1fa-f8da910c09d0/whereabouts-cni/0.log" Apr 21 02:56:18.237254 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:18.237231 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bzdk8_c2a4d15a-56b4-43a2-b85f-305025a28b5e/network-metrics-daemon/0.log" Apr 21 02:56:18.254679 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:18.254658 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bzdk8_c2a4d15a-56b4-43a2-b85f-305025a28b5e/kube-rbac-proxy/0.log" Apr 21 02:56:19.366859 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:19.366830 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cpdzb_0896d03a-bffd-41a6-83ef-fae8f7e239a7/ovn-controller/0.log" Apr 21 02:56:19.388775 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:19.388731 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cpdzb_0896d03a-bffd-41a6-83ef-fae8f7e239a7/ovn-acl-logging/0.log" Apr 21 02:56:19.410103 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:19.410073 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cpdzb_0896d03a-bffd-41a6-83ef-fae8f7e239a7/kube-rbac-proxy-node/0.log" Apr 21 02:56:19.429777 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:19.429748 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cpdzb_0896d03a-bffd-41a6-83ef-fae8f7e239a7/kube-rbac-proxy-ovn-metrics/0.log" Apr 21 02:56:19.448436 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:19.448407 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cpdzb_0896d03a-bffd-41a6-83ef-fae8f7e239a7/northd/0.log" Apr 21 02:56:19.467331 
ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:19.467311 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cpdzb_0896d03a-bffd-41a6-83ef-fae8f7e239a7/nbdb/0.log" Apr 21 02:56:19.486512 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:19.486493 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cpdzb_0896d03a-bffd-41a6-83ef-fae8f7e239a7/sbdb/0.log" Apr 21 02:56:19.572930 ip-10-0-137-147 kubenswrapper[2572]: I0421 02:56:19.572900 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cpdzb_0896d03a-bffd-41a6-83ef-fae8f7e239a7/ovnkube-controller/0.log"