Apr 16 18:10:12.490001 ip-10-0-143-48 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:10:13.040293 ip-10-0-143-48 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:10:13.040293 ip-10-0-143-48 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:10:13.040293 ip-10-0-143-48 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:10:13.040293 ip-10-0-143-48 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:10:13.040293 ip-10-0-143-48 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:10:13.041457 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.041368 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:10:13.051546 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051525 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:10:13.051546 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051543 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:10:13.051619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051549 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:10:13.051619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051554 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:10:13.051619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051557 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:10:13.051619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051560 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:10:13.051619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051563 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:10:13.051619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051566 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:10:13.051619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051569 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:10:13.051619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051572 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:10:13.051619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051574 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:10:13.051619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051577 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:10:13.051619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051579 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:10:13.051619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051582 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:10:13.051619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051585 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:10:13.051619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051588 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:10:13.051619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051591 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:10:13.051619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051593 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:10:13.051619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051596 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:10:13.051619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051600 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:10:13.051619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051603 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:10:13.052121 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051605 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:10:13.052121 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051608 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:10:13.052121 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051611 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:10:13.052121 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051613 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:10:13.052121 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051616 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:10:13.052121 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051619 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:10:13.052121 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051622 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:10:13.052121 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051626 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:10:13.052121 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051629 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:10:13.052121 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051631 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:10:13.052121 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051634 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:10:13.052121 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051637 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:10:13.052121 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051640 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:10:13.052121 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051642 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:10:13.052121 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051645 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:10:13.052121 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051649 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:10:13.052121 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051652 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:10:13.052121 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051656 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:10:13.052121 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051658 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:10:13.052595 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051661 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:10:13.052595 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051664 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:10:13.052595 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051666 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:10:13.052595 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051669 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:10:13.052595 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051671 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:10:13.052595 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051674 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:10:13.052595 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051676 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:10:13.052595 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051679 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:10:13.052595 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051682 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:10:13.052595 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051685 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:10:13.052595 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051688 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:10:13.052595 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051691 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:10:13.052595 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051705 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:10:13.052595 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051710 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:10:13.052595 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051714 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:10:13.052595 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051718 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:10:13.052595 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051723 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:10:13.052595 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051726 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:10:13.052595 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051729 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:10:13.052595 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051732 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:10:13.053138 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051735 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:10:13.053138 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051737 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:10:13.053138 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051740 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:10:13.053138 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051743 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:10:13.053138 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051746 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:10:13.053138 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051748 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:10:13.053138 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051751 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:10:13.053138 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051753 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:10:13.053138 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051756 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:10:13.053138 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051758 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:10:13.053138 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051762 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:10:13.053138 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051764 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:10:13.053138 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051767 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:10:13.053138 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051769 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:10:13.053138 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051772 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:10:13.053138 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051774 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:10:13.053138 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051777 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:10:13.053138 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051780 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:10:13.053138 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051782 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:10:13.053138 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051785 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:10:13.053619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051788 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:10:13.053619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051790 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:10:13.053619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051795 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:10:13.053619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051798 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:10:13.053619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051801 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:10:13.053619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.051804 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:10:13.053619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052238 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:10:13.053619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052244 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:10:13.053619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052247 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:10:13.053619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052249 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:10:13.053619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052252 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:10:13.053619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052255 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:10:13.053619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052258 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:10:13.053619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052260 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:10:13.053619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052262 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:10:13.053619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052265 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:10:13.053619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052268 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:10:13.053619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052270 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:10:13.053619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052274 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:10:13.053619 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052276 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:10:13.054125 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052279 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:10:13.054125 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052282 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:10:13.054125 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052284 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:10:13.054125 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052287 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:10:13.054125 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052290 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:10:13.054125 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052292 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:10:13.054125 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052295 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:10:13.054125 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052298 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:10:13.054125 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052300 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:10:13.054125 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052303 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:10:13.054125 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052306 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:10:13.054125 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052308 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:10:13.054125 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052311 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:10:13.054125 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052314 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:10:13.054125 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052316 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:10:13.054125 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052319 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:10:13.054125 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052321 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:10:13.054125 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052324 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:10:13.054125 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052328 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:10:13.054125 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052331 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:10:13.054651 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052333 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:10:13.054651 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052336 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:10:13.054651 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052338 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:10:13.054651 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052341 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:10:13.054651 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052344 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:10:13.054651 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052347 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:10:13.054651 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052349 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:10:13.054651 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052352 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:10:13.054651 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052355 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:10:13.054651 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052358 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:10:13.054651 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052362 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:10:13.054651 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052365 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:10:13.054651 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052368 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:10:13.054651 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052371 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:10:13.054651 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052374 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:10:13.054651 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052377 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:10:13.054651 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052379 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:10:13.054651 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052381 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:10:13.054651 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052384 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:10:13.055127 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052386 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:10:13.055127 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052389 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:10:13.055127 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052392 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:10:13.055127 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052395 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:10:13.055127 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052397 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:10:13.055127 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052400 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:10:13.055127 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052402 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:10:13.055127 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052405 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:10:13.055127 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052407 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:10:13.055127 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052410 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:10:13.055127 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052412 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:10:13.055127 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052417 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:10:13.055127 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052421 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:10:13.055127 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052424 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:10:13.055127 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052427 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:10:13.055127 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052430 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:10:13.055127 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052433 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:10:13.055127 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052435 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:10:13.055127 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052439 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:10:13.055596 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052441 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:10:13.055596 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052444 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:10:13.055596 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052447 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:10:13.055596 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052449 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:10:13.055596 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052451 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:10:13.055596 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052454 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:10:13.055596 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052458 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:10:13.055596 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052461 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:10:13.055596 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052464 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:10:13.055596 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052467 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:10:13.055596 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052469 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:10:13.055596 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052472 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:10:13.055596 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052474 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:10:13.055596 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.052476 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:10:13.055596 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053422 2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:10:13.055596 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053433 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:10:13.055596 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053440 2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:10:13.055596 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053445 2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:10:13.055596 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053450 2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:10:13.055596 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053454 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:10:13.055596 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053459 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:10:13.056123 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053463 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:10:13.056123 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053466 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:10:13.056123 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053470 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:10:13.056123 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053473 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:10:13.056123 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053477 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:10:13.056123 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053481 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:10:13.056123 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053484 2574 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:10:13.056123 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053487 2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:10:13.056123 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053502 2574 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:10:13.056123 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053508 2574 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:10:13.056123 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053512 2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:10:13.056123 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053516 2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:10:13.056123 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053521 2574 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:10:13.056123 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053525 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:10:13.056123 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053528 2574 flags.go:64] FLAG: --config-dir=""
Apr 16 18:10:13.056123 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053531 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:10:13.056123 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053534 2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:10:13.056123 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053539 2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:10:13.056123 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053542 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:10:13.056123 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053546 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:10:13.056123 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053549 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:10:13.056123 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053552 2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:10:13.056123 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053556 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:10:13.056123 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053558 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:10:13.056717 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053562 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:10:13.056717 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053565 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:10:13.056717 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053570 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:10:13.056717 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053573 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:10:13.056717 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053576 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:10:13.056717 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053579 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:10:13.056717 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053582 2574 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:10:13.056717 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053584 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:10:13.056717 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053589 2574 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:10:13.056717 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053592 2574 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:10:13.056717 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053595 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:10:13.056717 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053599 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:10:13.056717 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053603 2574 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:10:13.056717 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053607 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:10:13.056717 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053610 2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:10:13.056717 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053613 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 18:10:13.056717 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053616 2574 flags.go:64] FLAG: --eviction-soft=""
Apr 16 18:10:13.056717 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053619 2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 18:10:13.056717 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053622 2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 18:10:13.056717 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053625 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 18:10:13.056717 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053628 2574 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 18:10:13.056717 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053631 2574 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 18:10:13.056717 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053634 2574 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 18:10:13.056717 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053637 2574 flags.go:64] FLAG: --feature-gates=""
Apr 16 18:10:13.056717 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053641 2574 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 18:10:13.057327 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053645 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 18:10:13.057327 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053648 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 18:10:13.057327 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053652 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 18:10:13.057327 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053655 2574 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 18:10:13.057327 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053658 2574 flags.go:64] FLAG: --help="false"
Apr 16 18:10:13.057327
ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053661 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-143-48.ec2.internal" Apr 16 18:10:13.057327 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053664 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 18:10:13.057327 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053667 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 18:10:13.057327 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053670 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 18:10:13.057327 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053674 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 18:10:13.057327 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053678 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 18:10:13.057327 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053681 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 18:10:13.057327 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053684 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 18:10:13.057327 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053687 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 18:10:13.057327 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053690 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 18:10:13.057327 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053707 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 18:10:13.057327 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053711 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 18:10:13.057327 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053714 2574 flags.go:64] FLAG: --kube-reserved="" Apr 16 18:10:13.057327 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053717 2574 flags.go:64] 
FLAG: --kube-reserved-cgroup="" Apr 16 18:10:13.057327 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053720 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 18:10:13.057327 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053723 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 18:10:13.057327 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053727 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 18:10:13.057327 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053730 2574 flags.go:64] FLAG: --lock-file="" Apr 16 18:10:13.057327 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053733 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 18:10:13.057931 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053736 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 18:10:13.057931 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053739 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 18:10:13.057931 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053745 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 18:10:13.057931 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053748 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 18:10:13.057931 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053751 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 18:10:13.057931 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053754 2574 flags.go:64] FLAG: --logging-format="text" Apr 16 18:10:13.057931 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053757 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 18:10:13.057931 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053760 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 18:10:13.057931 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053763 2574 flags.go:64] FLAG: --manifest-url="" Apr 16 18:10:13.057931 ip-10-0-143-48 
kubenswrapper[2574]: I0416 18:10:13.053769 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 16 18:10:13.057931 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053774 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 18:10:13.057931 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053777 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 18:10:13.057931 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053782 2574 flags.go:64] FLAG: --max-pods="110" Apr 16 18:10:13.057931 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053785 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 18:10:13.057931 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053788 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 18:10:13.057931 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053791 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 18:10:13.057931 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053794 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 18:10:13.057931 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053797 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 18:10:13.057931 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053800 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 18:10:13.057931 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053803 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 18:10:13.057931 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053811 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 18:10:13.057931 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053814 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 18:10:13.057931 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053818 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 18:10:13.057931 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053821 2574 
flags.go:64] FLAG: --pod-cidr="" Apr 16 18:10:13.058508 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053824 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 18:10:13.058508 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053830 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 18:10:13.058508 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053833 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 18:10:13.058508 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053837 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 16 18:10:13.058508 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053840 2574 flags.go:64] FLAG: --port="10250" Apr 16 18:10:13.058508 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053843 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 18:10:13.058508 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053845 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-014b23a905f48fea3" Apr 16 18:10:13.058508 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053849 2574 flags.go:64] FLAG: --qos-reserved="" Apr 16 18:10:13.058508 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053852 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 16 18:10:13.058508 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053855 2574 flags.go:64] FLAG: --register-node="true" Apr 16 18:10:13.058508 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053858 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 16 18:10:13.058508 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053861 2574 flags.go:64] FLAG: --register-with-taints="" Apr 16 18:10:13.058508 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053865 2574 flags.go:64] FLAG: --registry-burst="10" Apr 16 18:10:13.058508 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053868 2574 flags.go:64] FLAG: --registry-qps="5" Apr 16 
18:10:13.058508 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053870 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 16 18:10:13.058508 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053873 2574 flags.go:64] FLAG: --reserved-memory="" Apr 16 18:10:13.058508 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053877 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 18:10:13.058508 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053882 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 18:10:13.058508 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053885 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 18:10:13.058508 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053888 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 18:10:13.058508 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053891 2574 flags.go:64] FLAG: --runonce="false" Apr 16 18:10:13.058508 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053894 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 18:10:13.058508 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053898 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 18:10:13.058508 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053901 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 16 18:10:13.058508 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053910 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 18:10:13.059133 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053915 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 18:10:13.059133 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053918 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 18:10:13.059133 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053921 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 18:10:13.059133 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053925 2574 flags.go:64] 
FLAG: --storage-driver-password="root" Apr 16 18:10:13.059133 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053928 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 18:10:13.059133 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053931 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 18:10:13.059133 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053934 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 18:10:13.059133 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053937 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 18:10:13.059133 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053940 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 18:10:13.059133 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053943 2574 flags.go:64] FLAG: --system-cgroups="" Apr 16 18:10:13.059133 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053946 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 18:10:13.059133 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053951 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 18:10:13.059133 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053954 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 16 18:10:13.059133 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053957 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 18:10:13.059133 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053962 2574 flags.go:64] FLAG: --tls-min-version="" Apr 16 18:10:13.059133 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053965 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 18:10:13.059133 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053968 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 18:10:13.059133 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053971 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 18:10:13.059133 ip-10-0-143-48 kubenswrapper[2574]: 
I0416 18:10:13.053974 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 18:10:13.059133 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053977 2574 flags.go:64] FLAG: --v="2" Apr 16 18:10:13.059133 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053982 2574 flags.go:64] FLAG: --version="false" Apr 16 18:10:13.059133 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053986 2574 flags.go:64] FLAG: --vmodule="" Apr 16 18:10:13.059133 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053990 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 18:10:13.059133 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.053994 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 18:10:13.059133 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054102 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:10:13.059730 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054105 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:10:13.059730 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054108 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:10:13.059730 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054111 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:10:13.059730 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054115 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:10:13.059730 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054118 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:10:13.059730 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054124 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:10:13.059730 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054127 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 
18:10:13.059730 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054130 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:10:13.059730 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054133 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:10:13.059730 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054136 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:10:13.059730 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054139 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:10:13.059730 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054142 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:10:13.059730 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054144 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:10:13.059730 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054147 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:10:13.059730 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054150 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:10:13.059730 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054153 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:10:13.059730 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054155 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:10:13.059730 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054158 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:10:13.059730 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054161 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:10:13.059730 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054163 2574 feature_gate.go:328] unrecognized feature gate: 
AWSClusterHostedDNSInstall Apr 16 18:10:13.060268 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054166 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:10:13.060268 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054169 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:10:13.060268 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054171 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:10:13.060268 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054174 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:10:13.060268 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054176 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:10:13.060268 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054180 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:10:13.060268 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054182 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:10:13.060268 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054185 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:10:13.060268 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054187 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:10:13.060268 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054189 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:10:13.060268 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054192 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:10:13.060268 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054196 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:10:13.060268 ip-10-0-143-48 kubenswrapper[2574]: W0416 
18:10:13.054199 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:10:13.060268 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054201 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:10:13.060268 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054204 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:10:13.060268 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054206 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:10:13.060268 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054210 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:10:13.060268 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054213 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:10:13.060268 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054216 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:10:13.060268 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054218 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:10:13.060849 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054221 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:10:13.060849 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054224 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:10:13.060849 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054226 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:10:13.060849 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054229 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:10:13.060849 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054232 2574 feature_gate.go:328] unrecognized feature gate: 
IngressControllerLBSubnetsAWS Apr 16 18:10:13.060849 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054234 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:10:13.060849 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054237 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:10:13.060849 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054239 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:10:13.060849 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054242 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:10:13.060849 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054245 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:10:13.060849 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054247 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:10:13.060849 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054250 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:10:13.060849 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054252 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:10:13.060849 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054255 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:10:13.060849 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054258 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:10:13.060849 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054260 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:10:13.060849 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054263 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:10:13.060849 ip-10-0-143-48 
kubenswrapper[2574]: W0416 18:10:13.054265 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:10:13.060849 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054268 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:10:13.061316 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054270 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:10:13.061316 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054273 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:10:13.061316 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054275 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:10:13.061316 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054278 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:10:13.061316 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054282 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:10:13.061316 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054284 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:10:13.061316 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054287 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:10:13.061316 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054289 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:10:13.061316 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054292 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:10:13.061316 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054294 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:10:13.061316 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054298 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI 
Apr 16 18:10:13.061316 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054300 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:10:13.061316 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054303 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:10:13.061316 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054306 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:10:13.061316 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054310 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 18:10:13.061316 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054314 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:10:13.061316 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054316 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:10:13.061316 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054319 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:10:13.061316 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054322 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:10:13.061316 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054325 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:10:13.061871 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054327 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:10:13.061871 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054330 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:10:13.061871 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054332 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:10:13.061871 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054335 2574 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:10:13.061871 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054339 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:10:13.061871 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.054342 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:10:13.061871 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.055673 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:10:13.062547 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.062526 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 18:10:13.062576 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.062548 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 18:10:13.062606 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062597 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:10:13.062606 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062602 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:10:13.062606 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062606 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:10:13.062686 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062610 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:10:13.062686 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062613 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:10:13.062686 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062616 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:10:13.062686 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062619 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:10:13.062686 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062621 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:10:13.062686 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062624 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:10:13.062686 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062627 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:10:13.062686 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062630 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:10:13.062686 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062632 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:10:13.062686 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062635 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:10:13.062686 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062638 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:10:13.062686 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062641 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:10:13.062686 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062644 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:10:13.062686 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062646 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:10:13.062686 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062649 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:10:13.062686 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062651 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:10:13.062686 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062654 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:10:13.062686 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062656 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:10:13.062686 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062659 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:10:13.063160 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062662 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:10:13.063160 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062664 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:10:13.063160 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062667 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:10:13.063160 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062670 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:10:13.063160 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062672 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:10:13.063160 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062675 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:10:13.063160 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062678 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:10:13.063160 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062680 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:10:13.063160 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062683 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:10:13.063160 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062687 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:10:13.063160 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062690 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:10:13.063160 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062707 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:10:13.063160 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062711 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:10:13.063160 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062714 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:10:13.063160 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062717 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:10:13.063160 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062719 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:10:13.063160 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062722 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:10:13.063160 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062725 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:10:13.063160 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062727 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:10:13.063160 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062730 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:10:13.063646 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062732 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:10:13.063646 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062735 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:10:13.063646 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062738 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:10:13.063646 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062740 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:10:13.063646 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062743 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:10:13.063646 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062746 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:10:13.063646 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062748 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:10:13.063646 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062751 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:10:13.063646 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062753 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:10:13.063646 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062756 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:10:13.063646 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062758 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:10:13.063646 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062761 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:10:13.063646 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062764 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:10:13.063646 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062767 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:10:13.063646 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062770 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:10:13.063646 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062772 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:10:13.063646 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062775 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:10:13.063646 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062778 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:10:13.063646 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062783 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:10:13.064118 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062788 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:10:13.064118 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062793 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:10:13.064118 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062796 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:10:13.064118 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062800 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:10:13.064118 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062803 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:10:13.064118 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062806 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:10:13.064118 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062808 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:10:13.064118 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062811 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:10:13.064118 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062814 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:10:13.064118 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062816 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:10:13.064118 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062819 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:10:13.064118 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062821 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:10:13.064118 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062824 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:10:13.064118 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062827 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:10:13.064118 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062829 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:10:13.064118 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062832 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:10:13.064118 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062835 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:10:13.064118 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062837 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:10:13.064118 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062840 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:10:13.064615 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062842 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:10:13.064615 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062845 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:10:13.064615 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062847 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:10:13.064615 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062850 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:10:13.064615 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062853 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:10:13.064615 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062856 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:10:13.064615 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.062862 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:10:13.064615 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062959 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:10:13.064615 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062964 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:10:13.064615 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062967 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:10:13.064615 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062970 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:10:13.064615 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062973 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:10:13.064615 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062976 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:10:13.064615 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062979 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:10:13.064615 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062981 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:10:13.065001 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062984 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:10:13.065001 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062986 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:10:13.065001 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062989 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:10:13.065001 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062992 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:10:13.065001 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062995 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:10:13.065001 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.062997 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:10:13.065001 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063000 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:10:13.065001 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063002 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:10:13.065001 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063005 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:10:13.065001 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063007 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:10:13.065001 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063010 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:10:13.065001 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063013 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:10:13.065001 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063015 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:10:13.065001 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063018 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:10:13.065001 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063020 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:10:13.065001 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063023 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:10:13.065001 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063026 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:10:13.065001 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063029 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:10:13.065001 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063031 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:10:13.065001 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063034 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:10:13.065492 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063036 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:10:13.065492 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063039 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:10:13.065492 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063041 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:10:13.065492 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063044 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:10:13.065492 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063047 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:10:13.065492 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063050 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:10:13.065492 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063052 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:10:13.065492 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063054 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:10:13.065492 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063057 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:10:13.065492 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063059 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:10:13.065492 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063062 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:10:13.065492 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063065 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:10:13.065492 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063067 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:10:13.065492 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063070 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:10:13.065492 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063072 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:10:13.065492 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063075 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:10:13.065492 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063077 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:10:13.065492 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063080 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:10:13.065492 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063082 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:10:13.065492 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063085 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:10:13.066022 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063088 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:10:13.066022 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063092 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:10:13.066022 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063095 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:10:13.066022 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063098 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:10:13.066022 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063101 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:10:13.066022 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063104 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:10:13.066022 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063107 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:10:13.066022 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063110 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:10:13.066022 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063113 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:10:13.066022 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063116 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:10:13.066022 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063118 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:10:13.066022 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063121 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:10:13.066022 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063125 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:10:13.066022 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063129 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:10:13.066022 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063132 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:10:13.066022 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063136 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:10:13.066022 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063138 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:10:13.066022 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063141 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:10:13.066022 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063144 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:10:13.066487 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063146 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:10:13.066487 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063149 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:10:13.066487 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063152 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:10:13.066487 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063155 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:10:13.066487 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063157 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:10:13.066487 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063160 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:10:13.066487 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063162 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:10:13.066487 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063165 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:10:13.066487 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063168 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:10:13.066487 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063171 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:10:13.066487 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063174 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:10:13.066487 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063176 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:10:13.066487 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063179 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:10:13.066487 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063182 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:10:13.066487 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063184 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:10:13.066487 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063187 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:10:13.066487 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063189 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:10:13.066487 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063192 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:10:13.066487 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:13.063194 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:10:13.066964 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.063199 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:10:13.066964 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.063314 2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 18:10:13.066964 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.065520 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 18:10:13.066964 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.066440 2574 server.go:1019] "Starting client certificate rotation"
Apr 16 18:10:13.066964 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.066531 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:10:13.066964 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.066563 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:10:13.098599 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.098575 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:10:13.104513 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.104407 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:10:13.128035 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.128004 2574 log.go:25] "Validated CRI v1 runtime API"
Apr 16 18:10:13.133859 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.133836 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:10:13.134000 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.133939 2574 log.go:25] "Validated CRI v1 image API"
Apr 16 18:10:13.136430 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.136407 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 18:10:13.139373 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.139349 2574 fs.go:135] Filesystem UUIDs: map[34c82192-3ded-46c9-97ee-03ca1c6187e6:/dev/nvme0n1p4 6c30bf2e-bcd1-445f-a818-0c938393abb8:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 16 18:10:13.139468 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.139371 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 18:10:13.144772 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.144629 2574 manager.go:217] Machine: {Timestamp:2026-04-16 18:10:13.143179823 +0000 UTC m=+0.506301112 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100417 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec27aa09c6a9e36a757a3415847d784f SystemUUID:ec27aa09-c6a9-e36a-757a-3415847d784f BootID:d7f4e04b-48bd-4b11-8400-2f070756958a Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:48:76:60:07:0b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:48:76:60:07:0b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:b2:4c:06:d2:cd:fe Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 18:10:13.144772 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.144760 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 18:10:13.144952 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.144873 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 18:10:13.148087 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.148057 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 18:10:13.148259 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.148089 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-48.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 18:10:13.148351 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.148274 2574 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 18:10:13.148351 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.148287 2574 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 18:10:13.148351 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.148305 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:10:13.148351 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.148333 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:10:13.150104 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.150090 2574 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:10:13.150241 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.150230 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 18:10:13.153318 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.153307 2574 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 18:10:13.154383 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.154370 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 18:10:13.155215 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.155204 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 18:10:13.155271 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.155227 2574 kubelet.go:397] "Adding apiserver pod source"
Apr 16 18:10:13.155271 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.155241 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 18:10:13.156347 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.156333 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:10:13.156428 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.156357 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:10:13.159821 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.159805 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 18:10:13.161669 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.161655 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 18:10:13.162937 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.162919 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 18:10:13.163006 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.162949 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 18:10:13.163006 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.162961 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 18:10:13.163006 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.162968 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 18:10:13.163006 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.162976 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 18:10:13.163006
ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.162983 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 18:10:13.163006 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.162993 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 18:10:13.163006 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.163007 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 18:10:13.163307 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.163019 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 18:10:13.163307 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.163027 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 18:10:13.163307 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.163040 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 18:10:13.163307 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.163052 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 18:10:13.164066 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.164053 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 18:10:13.164098 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.164067 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 18:10:13.165093 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:13.165071 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 18:10:13.169238 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:13.169199 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-48.ec2.internal\" is forbidden: 
User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 18:10:13.170680 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.170649 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 18:10:13.170773 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.170727 2574 server.go:1295] "Started kubelet" Apr 16 18:10:13.170882 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.170820 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 18:10:13.170924 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.170902 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 18:10:13.171184 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.170826 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 18:10:13.171638 ip-10-0-143-48 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 18:10:13.172039 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.172018 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 18:10:13.173926 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.173903 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 16 18:10:13.180345 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.180326 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 18:10:13.180513 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.180337 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 18:10:13.180580 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.180362 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dmlln" Apr 16 18:10:13.181294 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.181276 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 18:10:13.181294 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.181278 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 18:10:13.181406 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.181305 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 18:10:13.181464 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.181416 2574 factory.go:55] Registering systemd factory Apr 16 18:10:13.181464 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.181431 2574 factory.go:223] Registration of the systemd container factory successfully Apr 16 18:10:13.181464 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.181441 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 16 18:10:13.181464 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.181449 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 16 18:10:13.181720 ip-10-0-143-48 kubenswrapper[2574]: 
E0416 18:10:13.181668 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-48.ec2.internal\" not found" Apr 16 18:10:13.181824 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.181738 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-48.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 18:10:13.181824 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.181781 2574 factory.go:153] Registering CRI-O factory Apr 16 18:10:13.181824 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.181794 2574 factory.go:223] Registration of the crio container factory successfully Apr 16 18:10:13.181992 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.181853 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 18:10:13.181992 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.181876 2574 factory.go:103] Registering Raw factory Apr 16 18:10:13.181992 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.181889 2574 manager.go:1196] Started watching for new ooms in manager Apr 16 18:10:13.182321 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:13.182298 2574 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 18:10:13.182395 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.182326 2574 manager.go:319] Starting recovery of all containers Apr 16 18:10:13.183326 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:13.181826 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-48.ec2.internal.18a6e8bfb82e52de default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-48.ec2.internal,UID:ip-10-0-143-48.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-143-48.ec2.internal,},FirstTimestamp:2026-04-16 18:10:13.170680542 +0000 UTC m=+0.533801834,LastTimestamp:2026-04-16 18:10:13.170680542 +0000 UTC m=+0.533801834,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-48.ec2.internal,}" Apr 16 18:10:13.186967 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.186855 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dmlln" Apr 16 18:10:13.188220 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.188188 2574 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 16 18:10:13.191538 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.191513 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:10:13.192313 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.192292 2574 manager.go:324] Recovery completed Apr 16 18:10:13.193912 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:13.193889 2574 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-143-48.ec2.internal\" not found" node="ip-10-0-143-48.ec2.internal" Apr 16 18:10:13.196792 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.196775 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:10:13.199248 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.199233 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-48.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:10:13.199305 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.199261 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-48.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:10:13.199305 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.199270 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-48.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:10:13.199748 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.199733 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 18:10:13.199748 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.199747 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 18:10:13.199862 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.199782 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:10:13.202849 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.202837 2574 policy_none.go:49] "None policy: Start" Apr 16 18:10:13.202888 
ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.202852 2574 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 18:10:13.202888 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.202862 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 16 18:10:13.268302 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.248172 2574 manager.go:341] "Starting Device Plugin manager" Apr 16 18:10:13.268302 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:13.248216 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 18:10:13.268302 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.248230 2574 server.go:85] "Starting device plugin registration server" Apr 16 18:10:13.268302 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.248511 2574 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 18:10:13.268302 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.248528 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 18:10:13.268302 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.248624 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 18:10:13.268302 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.248687 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 18:10:13.268302 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.248707 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 18:10:13.268302 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:13.249359 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 16 18:10:13.268302 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:13.249393 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-48.ec2.internal\" not found" Apr 16 18:10:13.331128 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.331049 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 18:10:13.332218 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.332200 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 18:10:13.332313 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.332233 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 18:10:13.332313 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.332243 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 18:10:13.332313 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:13.332285 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 18:10:13.336338 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.336321 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:10:13.348765 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.348751 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:10:13.349794 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.349775 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-48.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:10:13.349866 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.349811 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-48.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:10:13.349866 ip-10-0-143-48 
kubenswrapper[2574]: I0416 18:10:13.349821 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-48.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:10:13.349866 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.349846 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-48.ec2.internal" Apr 16 18:10:13.358385 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.358365 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-48.ec2.internal" Apr 16 18:10:13.358454 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:13.358388 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-48.ec2.internal\": node \"ip-10-0-143-48.ec2.internal\" not found" Apr 16 18:10:13.380352 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:13.380325 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-48.ec2.internal\" not found" Apr 16 18:10:13.432613 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.432580 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-48.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-48.ec2.internal"] Apr 16 18:10:13.432731 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.432661 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:10:13.433650 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.433634 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-48.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:10:13.433738 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.433665 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-48.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:10:13.433738 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.433680 2574 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-48.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:10:13.436036 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.436022 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:10:13.436193 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.436167 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-48.ec2.internal" Apr 16 18:10:13.436230 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.436207 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:10:13.436780 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.436757 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-48.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:10:13.436780 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.436767 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-48.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:10:13.436780 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.436782 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-48.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:10:13.436943 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.436787 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-48.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:10:13.436943 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.436795 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-48.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:10:13.436943 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.436796 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-48.ec2.internal" 
event="NodeHasSufficientPID" Apr 16 18:10:13.439314 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.439294 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-48.ec2.internal" Apr 16 18:10:13.439412 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.439323 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:10:13.439938 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.439924 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-48.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:10:13.440011 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.439951 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-48.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:10:13.440011 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.439966 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-48.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:10:13.453287 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:13.453267 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-48.ec2.internal\" not found" node="ip-10-0-143-48.ec2.internal" Apr 16 18:10:13.457574 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:13.457558 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-48.ec2.internal\" not found" node="ip-10-0-143-48.ec2.internal" Apr 16 18:10:13.481157 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:13.481127 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-48.ec2.internal\" not found" Apr 16 18:10:13.483039 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.483020 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/48a65de737f127de1741ec3d41512a5f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-48.ec2.internal\" (UID: \"48a65de737f127de1741ec3d41512a5f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-48.ec2.internal" Apr 16 18:10:13.483095 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.483047 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/48a65de737f127de1741ec3d41512a5f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-48.ec2.internal\" (UID: \"48a65de737f127de1741ec3d41512a5f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-48.ec2.internal" Apr 16 18:10:13.483095 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.483068 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/022134da6e351443de7e25a90a6de7a4-config\") pod \"kube-apiserver-proxy-ip-10-0-143-48.ec2.internal\" (UID: \"022134da6e351443de7e25a90a6de7a4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-48.ec2.internal" Apr 16 18:10:13.582210 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:13.582128 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-48.ec2.internal\" not found" Apr 16 18:10:13.583275 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.583254 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/48a65de737f127de1741ec3d41512a5f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-48.ec2.internal\" (UID: \"48a65de737f127de1741ec3d41512a5f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-48.ec2.internal" Apr 16 18:10:13.583334 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.583285 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/022134da6e351443de7e25a90a6de7a4-config\") pod \"kube-apiserver-proxy-ip-10-0-143-48.ec2.internal\" (UID: \"022134da6e351443de7e25a90a6de7a4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-48.ec2.internal" Apr 16 18:10:13.583334 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.583302 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/48a65de737f127de1741ec3d41512a5f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-48.ec2.internal\" (UID: \"48a65de737f127de1741ec3d41512a5f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-48.ec2.internal" Apr 16 18:10:13.583427 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.583347 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/48a65de737f127de1741ec3d41512a5f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-48.ec2.internal\" (UID: \"48a65de737f127de1741ec3d41512a5f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-48.ec2.internal" Apr 16 18:10:13.583427 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.583374 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/022134da6e351443de7e25a90a6de7a4-config\") pod \"kube-apiserver-proxy-ip-10-0-143-48.ec2.internal\" (UID: \"022134da6e351443de7e25a90a6de7a4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-48.ec2.internal" Apr 16 18:10:13.583427 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.583377 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/48a65de737f127de1741ec3d41512a5f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-48.ec2.internal\" (UID: 
\"48a65de737f127de1741ec3d41512a5f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-48.ec2.internal" Apr 16 18:10:13.682833 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:13.682794 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-48.ec2.internal\" not found" Apr 16 18:10:13.757104 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.757077 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-48.ec2.internal" Apr 16 18:10:13.760576 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:13.760561 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-48.ec2.internal" Apr 16 18:10:13.783642 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:13.783607 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-48.ec2.internal\" not found" Apr 16 18:10:13.884118 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:13.884028 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-48.ec2.internal\" not found" Apr 16 18:10:13.984512 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:13.984478 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-48.ec2.internal\" not found" Apr 16 18:10:14.066782 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:14.066755 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 18:10:14.067338 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:14.066924 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items 
received"
Apr 16 18:10:14.067338 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:14.066940 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:10:14.085100 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:14.085071 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-48.ec2.internal\" not found"
Apr 16 18:10:14.180905 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:14.180817 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 18:10:14.185591 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:14.185563 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-48.ec2.internal\" not found"
Apr 16 18:10:14.189463 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:14.189431 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 18:05:13 +0000 UTC" deadline="2027-12-28 18:53:37.820737582 +0000 UTC"
Apr 16 18:10:14.189558 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:14.189463 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14904h43m23.631278957s"
Apr 16 18:10:14.204406 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:14.204381 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:10:14.227894 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:14.227870 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-sqcdf"
Apr 16 18:10:14.237007 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:14.236986 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-sqcdf"
Apr 16 18:10:14.286262 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:14.286229 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-48.ec2.internal\" not found"
Apr 16 18:10:14.386951 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:14.386918 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-48.ec2.internal\" not found"
Apr 16 18:10:14.418127 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:14.418084 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod022134da6e351443de7e25a90a6de7a4.slice/crio-0895bb4d7666ffae47eb5e84172515ec9da0c9fe25e2d566a8a00e4b08ccc0b4 WatchSource:0}: Error finding container 0895bb4d7666ffae47eb5e84172515ec9da0c9fe25e2d566a8a00e4b08ccc0b4: Status 404 returned error can't find the container with id 0895bb4d7666ffae47eb5e84172515ec9da0c9fe25e2d566a8a00e4b08ccc0b4
Apr 16 18:10:14.418390 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:14.418364 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48a65de737f127de1741ec3d41512a5f.slice/crio-06e8d626edb234a16f68c267d020ff9e8086de648514e7a429c11821ba77aab4 WatchSource:0}: Error finding container 06e8d626edb234a16f68c267d020ff9e8086de648514e7a429c11821ba77aab4: Status 404 returned error can't find the container with id 06e8d626edb234a16f68c267d020ff9e8086de648514e7a429c11821ba77aab4
Apr 16 18:10:14.422374 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:14.422350 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:10:14.448419 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:14.448361 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:10:14.487189 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:14.487160 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-48.ec2.internal\" not found"
Apr 16 18:10:14.587558 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:14.587525 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-48.ec2.internal\" not found"
Apr 16 18:10:14.688267 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:14.688228 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-48.ec2.internal\" not found"
Apr 16 18:10:14.765466 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:14.765401 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:10:14.780874 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:14.780839 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-48.ec2.internal"
Apr 16 18:10:14.793788 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:14.793756 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:10:14.795084 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:14.795055 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-48.ec2.internal"
Apr 16 18:10:14.801062 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:14.801041 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:10:15.156372 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.156341 2574 apiserver.go:52] "Watching apiserver"
Apr 16 18:10:15.163053 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.162916 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 18:10:15.163415 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.163385 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-143-48.ec2.internal","openshift-dns/node-resolver-sqdqp","openshift-image-registry/node-ca-4dfv8","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-48.ec2.internal","openshift-multus/multus-jp8j9","openshift-ovn-kubernetes/ovnkube-node-mbm6k","kube-system/global-pull-secret-syncer-56w6g","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp","openshift-cluster-node-tuning-operator/tuned-8tjw6","openshift-multus/multus-additional-cni-plugins-dg7bs","openshift-multus/network-metrics-daemon-7gnqt","openshift-network-diagnostics/network-check-target-9jtfh","openshift-network-operator/iptables-alerter-t6q6l","kube-system/konnectivity-agent-m6xb7"]
Apr 16 18:10:15.166282 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.166259 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dg7bs"
Apr 16 18:10:15.168432 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.168409 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-t6q6l"
Apr 16 18:10:15.170055 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.170033 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 18:10:15.170157 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.170090 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 18:10:15.170278 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.170253 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:10:15.170349 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.170300 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 18:10:15.170728 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.170420 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-h459l\""
Apr 16 18:10:15.170728 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.170550 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 18:10:15.170728 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.170553 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-7rjld\""
Apr 16 18:10:15.170728 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.170615 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 18:10:15.171147 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.170876 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 18:10:15.171147 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.170917 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 18:10:15.173005 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.172918 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4dfv8"
Apr 16 18:10:15.174665 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.174601 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-rppnt\""
Apr 16 18:10:15.174864 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.174842 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 18:10:15.175062 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.175042 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 18:10:15.175123 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.175063 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 18:10:15.175233 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.175213 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-m6xb7"
Apr 16 18:10:15.175375 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.175356 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jp8j9"
Apr 16 18:10:15.177017 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.176977 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 18:10:15.177017 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.177013 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-x7chk\""
Apr 16 18:10:15.177207 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.177056 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-gtcbv\""
Apr 16 18:10:15.177207 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.177066 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 18:10:15.177207 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.177087 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 18:10:15.178084 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.178061 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k"
Apr 16 18:10:15.179852 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.179831 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 18:10:15.180008 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.179893 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 18:10:15.180152 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.179921 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 18:10:15.180561 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.180537 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 18:10:15.180810 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.180791 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 18:10:15.180895 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.180795 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rdpj2\""
Apr 16 18:10:15.181469 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.181365 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 18:10:15.183782 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.183677 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8tjw6"
Apr 16 18:10:15.185316 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.185300 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:10:15.185637 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.185616 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 18:10:15.185761 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.185717 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-gkmbz\""
Apr 16 18:10:15.186018 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.185996 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-56w6g"
Apr 16 18:10:15.186113 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.186084 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sqdqp"
Apr 16 18:10:15.186113 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:15.186082 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-56w6g" podUID="4f7d4bee-e467-46d2-b0cf-bba98e8cd041"
Apr 16 18:10:15.187984 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.187805 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 18:10:15.187984 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.187827 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-2phrw\""
Apr 16 18:10:15.187984 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.187861 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 18:10:15.188321 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.188306 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp"
Apr 16 18:10:15.190001 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.189932 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 18:10:15.190108 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.190034 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9a3d42b4-5542-4575-a00e-81713e0d9ced-iptables-alerter-script\") pod \"iptables-alerter-t6q6l\" (UID: \"9a3d42b4-5542-4575-a00e-81713e0d9ced\") " pod="openshift-network-operator/iptables-alerter-t6q6l"
Apr 16 18:10:15.190108 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.190074 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a22f332b-dce3-4709-8d3e-a6ca1b00bc4a-host\") pod \"node-ca-4dfv8\" (UID: \"a22f332b-dce3-4709-8d3e-a6ca1b00bc4a\") " pod="openshift-image-registry/node-ca-4dfv8"
Apr 16 18:10:15.190213 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.190127 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a22f332b-dce3-4709-8d3e-a6ca1b00bc4a-serviceca\") pod \"node-ca-4dfv8\" (UID: \"a22f332b-dce3-4709-8d3e-a6ca1b00bc4a\") " pod="openshift-image-registry/node-ca-4dfv8"
Apr 16 18:10:15.190213 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.190135 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-slxgm\""
Apr 16 18:10:15.190213 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.190132 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 18:10:15.190213 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.190159 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-host-kubelet\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k"
Apr 16 18:10:15.190213 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.190184 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-log-socket\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k"
Apr 16 18:10:15.190445 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.190221 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4454b177-644b-4125-8259-d9aeaf036cf6-ovnkube-config\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k"
Apr 16 18:10:15.190445 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.190244 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/05edfe7d-7845-4f05-a320-9d359462ba01-agent-certs\") pod \"konnectivity-agent-m6xb7\" (UID: \"05edfe7d-7845-4f05-a320-9d359462ba01\") " pod="kube-system/konnectivity-agent-m6xb7"
Apr 16 18:10:15.190445 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.190313 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-multus-cni-dir\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9"
Apr 16 18:10:15.190445 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.190399 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-host-var-lib-kubelet\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9"
Apr 16 18:10:15.190445 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.190430 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-run\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6"
Apr 16 18:10:15.190737 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.190455 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-host\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6"
Apr 16 18:10:15.190737 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.190478 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a31dbcfe-5679-4c94-9030-4e1442f23cf0-tmp\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6"
Apr 16 18:10:15.190737 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.190499 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-dbus\") pod \"global-pull-secret-syncer-56w6g\" (UID: \"4f7d4bee-e467-46d2-b0cf-bba98e8cd041\") " pod="kube-system/global-pull-secret-syncer-56w6g"
Apr 16 18:10:15.190737 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.190555 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7xwj\" (UniqueName: \"kubernetes.io/projected/4454b177-644b-4125-8259-d9aeaf036cf6-kube-api-access-t7xwj\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k"
Apr 16 18:10:15.190737 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.190581 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b00af592-6ad1-4cc6-8dc6-0b46ced5a45c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dg7bs\" (UID: \"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c\") " pod="openshift-multus/multus-additional-cni-plugins-dg7bs"
Apr 16 18:10:15.190737 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.190623 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-host-run-netns\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k"
Apr 16 18:10:15.190737 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.190650 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-var-lib-openvswitch\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k"
Apr 16 18:10:15.190737 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.190675 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-run-ovn\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k"
Apr 16 18:10:15.191265 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.190830 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4454b177-644b-4125-8259-d9aeaf036cf6-env-overrides\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k"
Apr 16 18:10:15.191265 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.190868 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b00af592-6ad1-4cc6-8dc6-0b46ced5a45c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dg7bs\" (UID: \"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c\") " pod="openshift-multus/multus-additional-cni-plugins-dg7bs"
Apr 16 18:10:15.191265 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.190894 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-host-run-netns\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9"
Apr 16 18:10:15.191265 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.190920 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-host-var-lib-cni-bin\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9"
Apr 16 18:10:15.191265 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.190943 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a31dbcfe-5679-4c94-9030-4e1442f23cf0-etc-tuned\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6"
Apr 16 18:10:15.191265 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.190967 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b00af592-6ad1-4cc6-8dc6-0b46ced5a45c-system-cni-dir\") pod \"multus-additional-cni-plugins-dg7bs\" (UID: \"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c\") " pod="openshift-multus/multus-additional-cni-plugins-dg7bs"
Apr 16 18:10:15.191265 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191005 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-original-pull-secret\") pod \"global-pull-secret-syncer-56w6g\" (UID: \"4f7d4bee-e467-46d2-b0cf-bba98e8cd041\") " pod="kube-system/global-pull-secret-syncer-56w6g"
Apr 16 18:10:15.191265 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191035 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-os-release\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9"
Apr 16 18:10:15.191265 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191070 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-multus-conf-dir\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9"
Apr 16 18:10:15.191265 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191094 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9a3d42b4-5542-4575-a00e-81713e0d9ced-host-slash\") pod \"iptables-alerter-t6q6l\" (UID: \"9a3d42b4-5542-4575-a00e-81713e0d9ced\") " pod="openshift-network-operator/iptables-alerter-t6q6l"
Apr 16 18:10:15.191265 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191117 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-node-log\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k"
Apr 16 18:10:15.191265 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191153 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-host-run-ovn-kubernetes\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k"
Apr 16 18:10:15.191265 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191183 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-host-cni-bin\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k"
Apr 16 18:10:15.191265 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191253 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-multus-socket-dir-parent\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9"
Apr 16 18:10:15.192009 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191277 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnfsq\" (UniqueName: \"kubernetes.io/projected/a22f332b-dce3-4709-8d3e-a6ca1b00bc4a-kube-api-access-pnfsq\") pod \"node-ca-4dfv8\" (UID: \"a22f332b-dce3-4709-8d3e-a6ca1b00bc4a\") " pod="openshift-image-registry/node-ca-4dfv8"
Apr 16 18:10:15.192009 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191302 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-host-var-lib-cni-multus\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9"
Apr 16 18:10:15.192009 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191327 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-etc-kubernetes\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9"
Apr 16 18:10:15.192009 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191362 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-etc-sysctl-conf\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6"
Apr 16 18:10:15.192009 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191390 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-sys\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6"
Apr 16 18:10:15.192009 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191419 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-systemd-units\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k"
Apr 16 18:10:15.192009 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191437 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-host-slash\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k"
Apr 16 18:10:15.192009 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191453 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-host-cni-netd\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k"
Apr 16 18:10:15.192009 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191472 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b00af592-6ad1-4cc6-8dc6-0b46ced5a45c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dg7bs\" (UID: \"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c\") " pod="openshift-multus/multus-additional-cni-plugins-dg7bs"
Apr 16 18:10:15.192009 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191522 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-cnibin\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9"
Apr 16 18:10:15.192009 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191568 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d01511ca-4447-40f9-8518-9d2f62898c7a-multus-daemon-config\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9"
Apr 16 18:10:15.192009 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191592 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-etc-sysconfig\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6"
Apr 16 18:10:15.192009 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191616 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b00af592-6ad1-4cc6-8dc6-0b46ced5a45c-os-release\") pod \"multus-additional-cni-plugins-dg7bs\" (UID: \"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c\") " pod="openshift-multus/multus-additional-cni-plugins-dg7bs"
Apr 16 18:10:15.192009 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191638 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gnqt"
Apr 16 18:10:15.192009 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191638 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-etc-sysctl-d\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6"
Apr 16 18:10:15.192009 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191740 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-kubelet-config\") pod \"global-pull-secret-syncer-56w6g\" (UID: \"4f7d4bee-e467-46d2-b0cf-bba98e8cd041\") " pod="kube-system/global-pull-secret-syncer-56w6g"
Apr 16 18:10:15.192009 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:15.191782 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gnqt" podUID="95b4efe3-7d49-40fc-a8e3-4381e92ed949"
Apr 16 18:10:15.192859 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191799 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-host-run-multus-certs\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9"
Apr 16 18:10:15.192859 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191829 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmlgq\" (UniqueName: \"kubernetes.io/projected/9a3d42b4-5542-4575-a00e-81713e0d9ced-kube-api-access-vmlgq\") pod \"iptables-alerter-t6q6l\" (UID: \"9a3d42b4-5542-4575-a00e-81713e0d9ced\") " pod="openshift-network-operator/iptables-alerter-t6q6l"
Apr 16 18:10:15.192859 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191853 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b00af592-6ad1-4cc6-8dc6-0b46ced5a45c-cnibin\") pod \"multus-additional-cni-plugins-dg7bs\" (UID: \"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c\") " pod="openshift-multus/multus-additional-cni-plugins-dg7bs"
Apr 16 18:10:15.192859 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191874 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d01511ca-4447-40f9-8518-9d2f62898c7a-cni-binary-copy\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9"
Apr 16 18:10:15.192859 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191901 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-etc-openvswitch\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k"
Apr 16 18:10:15.192859 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191924 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-run-openvswitch\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k"
Apr 16 18:10:15.192859 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191948 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b00af592-6ad1-4cc6-8dc6-0b46ced5a45c-cni-binary-copy\") pod \"multus-additional-cni-plugins-dg7bs\" (UID: \"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c\") " pod="openshift-multus/multus-additional-cni-plugins-dg7bs"
Apr 16 18:10:15.192859 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.191980 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-hostroot\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9"
Apr 16 18:10:15.192859 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.192016 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-var-lib-kubelet\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6"
Apr
16 18:10:15.192859 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.192050 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2jgz\" (UniqueName: \"kubernetes.io/projected/a31dbcfe-5679-4c94-9030-4e1442f23cf0-kube-api-access-l2jgz\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.192859 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.192078 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4454b177-644b-4125-8259-d9aeaf036cf6-ovnkube-script-lib\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.192859 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.192107 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68mjs\" (UniqueName: \"kubernetes.io/projected/d01511ca-4447-40f9-8518-9d2f62898c7a-kube-api-access-68mjs\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.192859 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.192131 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-etc-systemd\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.192859 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.192153 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-lib-modules\") pod \"tuned-8tjw6\" 
(UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.192859 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.192175 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-run-systemd\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.192859 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.192219 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4454b177-644b-4125-8259-d9aeaf036cf6-ovn-node-metrics-cert\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.193595 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.192248 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97fnl\" (UniqueName: \"kubernetes.io/projected/b00af592-6ad1-4cc6-8dc6-0b46ced5a45c-kube-api-access-97fnl\") pod \"multus-additional-cni-plugins-dg7bs\" (UID: \"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c\") " pod="openshift-multus/multus-additional-cni-plugins-dg7bs" Apr 16 18:10:15.193595 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.192278 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/05edfe7d-7845-4f05-a320-9d359462ba01-konnectivity-ca\") pod \"konnectivity-agent-m6xb7\" (UID: \"05edfe7d-7845-4f05-a320-9d359462ba01\") " pod="kube-system/konnectivity-agent-m6xb7" Apr 16 18:10:15.193595 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.192324 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-host-run-k8s-cni-cncf-io\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.193595 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.192354 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-etc-modprobe-d\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.193595 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.192379 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.193595 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.192408 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-system-cni-dir\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.193595 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.192432 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-etc-kubernetes\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.193595 
ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.193154 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 18:10:15.194195 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.194169 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jtfh" Apr 16 18:10:15.194296 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:15.194240 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jtfh" podUID="ae61ed6e-f711-4ff6-a33f-c2ff79830f57" Apr 16 18:10:15.210234 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.209916 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:10:15.238789 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.238753 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:05:14 +0000 UTC" deadline="2027-10-07 09:06:44.674506772 +0000 UTC" Apr 16 18:10:15.238789 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.238786 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12926h56m29.435724218s" Apr 16 18:10:15.282450 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.282418 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 18:10:15.293339 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.293305 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-t7xwj\" (UniqueName: \"kubernetes.io/projected/4454b177-644b-4125-8259-d9aeaf036cf6-kube-api-access-t7xwj\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.293500 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.293349 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b00af592-6ad1-4cc6-8dc6-0b46ced5a45c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dg7bs\" (UID: \"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c\") " pod="openshift-multus/multus-additional-cni-plugins-dg7bs" Apr 16 18:10:15.293500 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.293381 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5806a3c4-be78-414a-9251-725dc0b94d51-hosts-file\") pod \"node-resolver-sqdqp\" (UID: \"5806a3c4-be78-414a-9251-725dc0b94d51\") " pod="openshift-dns/node-resolver-sqdqp" Apr 16 18:10:15.293500 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.293408 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-host-run-netns\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.293500 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.293434 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-var-lib-openvswitch\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.293500 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.293459 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-run-ovn\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.293500 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.293485 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4454b177-644b-4125-8259-d9aeaf036cf6-env-overrides\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.293900 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.293512 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b00af592-6ad1-4cc6-8dc6-0b46ced5a45c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dg7bs\" (UID: \"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c\") " pod="openshift-multus/multus-additional-cni-plugins-dg7bs" Apr 16 18:10:15.293900 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.293518 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-var-lib-openvswitch\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.293900 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.293548 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-host-run-netns\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.293900 ip-10-0-143-48 
kubenswrapper[2574]: I0416 18:10:15.293541 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-host-run-netns\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.293900 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.293577 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-run-ovn\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.293900 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.293587 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-host-var-lib-cni-bin\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.293900 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.293624 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-host-run-netns\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.293900 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.293642 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-host-var-lib-cni-bin\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.293900 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.293655 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a31dbcfe-5679-4c94-9030-4e1442f23cf0-etc-tuned\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.293900 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.293687 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b00af592-6ad1-4cc6-8dc6-0b46ced5a45c-system-cni-dir\") pod \"multus-additional-cni-plugins-dg7bs\" (UID: \"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c\") " pod="openshift-multus/multus-additional-cni-plugins-dg7bs" Apr 16 18:10:15.293900 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.293725 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-original-pull-secret\") pod \"global-pull-secret-syncer-56w6g\" (UID: \"4f7d4bee-e467-46d2-b0cf-bba98e8cd041\") " pod="kube-system/global-pull-secret-syncer-56w6g" Apr 16 18:10:15.293900 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.293740 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b00af592-6ad1-4cc6-8dc6-0b46ced5a45c-system-cni-dir\") pod \"multus-additional-cni-plugins-dg7bs\" (UID: \"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c\") " pod="openshift-multus/multus-additional-cni-plugins-dg7bs" Apr 16 18:10:15.293900 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.293757 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9dccd9fe-191d-4ac5-ab44-7a88f2d50784-etc-selinux\") pod \"aws-ebs-csi-driver-node-vltcp\" (UID: \"9dccd9fe-191d-4ac5-ab44-7a88f2d50784\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp" Apr 
16 18:10:15.293900 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.293814 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-os-release\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.293900 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.293840 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-multus-conf-dir\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.293900 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.293874 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9dccd9fe-191d-4ac5-ab44-7a88f2d50784-device-dir\") pod \"aws-ebs-csi-driver-node-vltcp\" (UID: \"9dccd9fe-191d-4ac5-ab44-7a88f2d50784\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp" Apr 16 18:10:15.293900 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.293909 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9a3d42b4-5542-4575-a00e-81713e0d9ced-host-slash\") pod \"iptables-alerter-t6q6l\" (UID: \"9a3d42b4-5542-4575-a00e-81713e0d9ced\") " pod="openshift-network-operator/iptables-alerter-t6q6l" Apr 16 18:10:15.294778 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.293928 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-os-release\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.294778 
ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.293934 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-node-log\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.294778 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.293969 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 18:10:15.294778 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.293992 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-node-log\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.294778 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:15.294025 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:15.294778 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294030 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9a3d42b4-5542-4575-a00e-81713e0d9ced-host-slash\") pod \"iptables-alerter-t6q6l\" (UID: \"9a3d42b4-5542-4575-a00e-81713e0d9ced\") " pod="openshift-network-operator/iptables-alerter-t6q6l" Apr 16 18:10:15.294778 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294057 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-multus-conf-dir\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " 
pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.294778 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294060 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-host-run-ovn-kubernetes\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.294778 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294096 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-host-run-ovn-kubernetes\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.294778 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:15.294164 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-original-pull-secret podName:4f7d4bee-e467-46d2-b0cf-bba98e8cd041 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:15.794117821 +0000 UTC m=+3.157239109 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-original-pull-secret") pod "global-pull-secret-syncer-56w6g" (UID: "4f7d4bee-e467-46d2-b0cf-bba98e8cd041") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:15.294778 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294180 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4454b177-644b-4125-8259-d9aeaf036cf6-env-overrides\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.294778 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294184 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-host-cni-bin\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.294778 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294254 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-host-cni-bin\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.294778 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294238 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b00af592-6ad1-4cc6-8dc6-0b46ced5a45c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dg7bs\" (UID: \"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c\") " pod="openshift-multus/multus-additional-cni-plugins-dg7bs" Apr 16 18:10:15.294778 ip-10-0-143-48 kubenswrapper[2574]: 
I0416 18:10:15.294275 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-multus-socket-dir-parent\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.294778 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294329 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-multus-socket-dir-parent\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.294778 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294365 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pnfsq\" (UniqueName: \"kubernetes.io/projected/a22f332b-dce3-4709-8d3e-a6ca1b00bc4a-kube-api-access-pnfsq\") pod \"node-ca-4dfv8\" (UID: \"a22f332b-dce3-4709-8d3e-a6ca1b00bc4a\") " pod="openshift-image-registry/node-ca-4dfv8" Apr 16 18:10:15.295559 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294392 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-host-var-lib-cni-multus\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.295559 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294415 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-etc-kubernetes\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.295559 ip-10-0-143-48 kubenswrapper[2574]: I0416 
18:10:15.294437 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-etc-sysctl-conf\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.295559 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294480 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-etc-kubernetes\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.295559 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294490 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-host-var-lib-cni-multus\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.295559 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294550 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-sys\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.295559 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294578 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq57m\" (UniqueName: \"kubernetes.io/projected/5806a3c4-be78-414a-9251-725dc0b94d51-kube-api-access-sq57m\") pod \"node-resolver-sqdqp\" (UID: \"5806a3c4-be78-414a-9251-725dc0b94d51\") " pod="openshift-dns/node-resolver-sqdqp" Apr 16 18:10:15.295559 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294614 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-systemd-units\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.295559 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294619 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-sys\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.295559 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294621 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-etc-sysctl-conf\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.295559 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294647 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-host-slash\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.295559 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294710 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-host-slash\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.295559 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294750 2574 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-systemd-units\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.295559 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294754 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-host-cni-netd\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.295559 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294780 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b00af592-6ad1-4cc6-8dc6-0b46ced5a45c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dg7bs\" (UID: \"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c\") " pod="openshift-multus/multus-additional-cni-plugins-dg7bs" Apr 16 18:10:15.295559 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294812 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-cnibin\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.295559 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294834 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d01511ca-4447-40f9-8518-9d2f62898c7a-multus-daemon-config\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.295559 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294854 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-etc-sysconfig\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.296465 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294783 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-host-cni-netd\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.296465 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294898 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-etc-sysconfig\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.296465 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294917 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-cnibin\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.296465 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294943 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b00af592-6ad1-4cc6-8dc6-0b46ced5a45c-os-release\") pod \"multus-additional-cni-plugins-dg7bs\" (UID: \"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c\") " pod="openshift-multus/multus-additional-cni-plugins-dg7bs" Apr 16 18:10:15.296465 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294897 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b00af592-6ad1-4cc6-8dc6-0b46ced5a45c-os-release\") pod \"multus-additional-cni-plugins-dg7bs\" (UID: \"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c\") " pod="openshift-multus/multus-additional-cni-plugins-dg7bs" Apr 16 18:10:15.296465 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294974 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-etc-sysctl-d\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.296465 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.294995 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-kubelet-config\") pod \"global-pull-secret-syncer-56w6g\" (UID: \"4f7d4bee-e467-46d2-b0cf-bba98e8cd041\") " pod="kube-system/global-pull-secret-syncer-56w6g" Apr 16 18:10:15.296465 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295019 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsv66\" (UniqueName: \"kubernetes.io/projected/95b4efe3-7d49-40fc-a8e3-4381e92ed949-kube-api-access-wsv66\") pod \"network-metrics-daemon-7gnqt\" (UID: \"95b4efe3-7d49-40fc-a8e3-4381e92ed949\") " pod="openshift-multus/network-metrics-daemon-7gnqt" Apr 16 18:10:15.296465 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295050 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-host-run-multus-certs\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.296465 ip-10-0-143-48 kubenswrapper[2574]: I0416 
18:10:15.295077 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vmlgq\" (UniqueName: \"kubernetes.io/projected/9a3d42b4-5542-4575-a00e-81713e0d9ced-kube-api-access-vmlgq\") pod \"iptables-alerter-t6q6l\" (UID: \"9a3d42b4-5542-4575-a00e-81713e0d9ced\") " pod="openshift-network-operator/iptables-alerter-t6q6l" Apr 16 18:10:15.296465 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295100 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b00af592-6ad1-4cc6-8dc6-0b46ced5a45c-cnibin\") pod \"multus-additional-cni-plugins-dg7bs\" (UID: \"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c\") " pod="openshift-multus/multus-additional-cni-plugins-dg7bs" Apr 16 18:10:15.296465 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295123 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d01511ca-4447-40f9-8518-9d2f62898c7a-cni-binary-copy\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.296465 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295150 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5806a3c4-be78-414a-9251-725dc0b94d51-tmp-dir\") pod \"node-resolver-sqdqp\" (UID: \"5806a3c4-be78-414a-9251-725dc0b94d51\") " pod="openshift-dns/node-resolver-sqdqp" Apr 16 18:10:15.296465 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295173 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-etc-openvswitch\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.296465 ip-10-0-143-48 
kubenswrapper[2574]: I0416 18:10:15.295196 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-run-openvswitch\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.296465 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295223 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b00af592-6ad1-4cc6-8dc6-0b46ced5a45c-cni-binary-copy\") pod \"multus-additional-cni-plugins-dg7bs\" (UID: \"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c\") " pod="openshift-multus/multus-additional-cni-plugins-dg7bs" Apr 16 18:10:15.296465 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295247 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-hostroot\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.297301 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295299 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-var-lib-kubelet\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.297301 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295325 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l2jgz\" (UniqueName: \"kubernetes.io/projected/a31dbcfe-5679-4c94-9030-4e1442f23cf0-kube-api-access-l2jgz\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 
18:10:15.297301 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295335 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b00af592-6ad1-4cc6-8dc6-0b46ced5a45c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dg7bs\" (UID: \"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c\") " pod="openshift-multus/multus-additional-cni-plugins-dg7bs" Apr 16 18:10:15.297301 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295337 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b00af592-6ad1-4cc6-8dc6-0b46ced5a45c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dg7bs\" (UID: \"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c\") " pod="openshift-multus/multus-additional-cni-plugins-dg7bs" Apr 16 18:10:15.297301 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295410 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d01511ca-4447-40f9-8518-9d2f62898c7a-multus-daemon-config\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.297301 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295411 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-etc-openvswitch\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.297301 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295475 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-var-lib-kubelet\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " 
pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.297301 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295487 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-run-openvswitch\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.297301 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295345 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4454b177-644b-4125-8259-d9aeaf036cf6-ovnkube-script-lib\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.297301 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295521 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-hostroot\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.297301 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295525 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68mjs\" (UniqueName: \"kubernetes.io/projected/d01511ca-4447-40f9-8518-9d2f62898c7a-kube-api-access-68mjs\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.297301 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295558 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-etc-systemd\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " 
pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.297301 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295584 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-lib-modules\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.297301 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295583 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-kubelet-config\") pod \"global-pull-secret-syncer-56w6g\" (UID: \"4f7d4bee-e467-46d2-b0cf-bba98e8cd041\") " pod="kube-system/global-pull-secret-syncer-56w6g" Apr 16 18:10:15.297301 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295615 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84l4f\" (UniqueName: \"kubernetes.io/projected/ae61ed6e-f711-4ff6-a33f-c2ff79830f57-kube-api-access-84l4f\") pod \"network-check-target-9jtfh\" (UID: \"ae61ed6e-f711-4ff6-a33f-c2ff79830f57\") " pod="openshift-network-diagnostics/network-check-target-9jtfh" Apr 16 18:10:15.297301 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295643 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-run-systemd\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.297301 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295669 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4454b177-644b-4125-8259-d9aeaf036cf6-ovn-node-metrics-cert\") 
pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.298130 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295715 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9dccd9fe-191d-4ac5-ab44-7a88f2d50784-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vltcp\" (UID: \"9dccd9fe-191d-4ac5-ab44-7a88f2d50784\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp" Apr 16 18:10:15.298130 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295742 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95b4efe3-7d49-40fc-a8e3-4381e92ed949-metrics-certs\") pod \"network-metrics-daemon-7gnqt\" (UID: \"95b4efe3-7d49-40fc-a8e3-4381e92ed949\") " pod="openshift-multus/network-metrics-daemon-7gnqt" Apr 16 18:10:15.298130 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295769 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97fnl\" (UniqueName: \"kubernetes.io/projected/b00af592-6ad1-4cc6-8dc6-0b46ced5a45c-kube-api-access-97fnl\") pod \"multus-additional-cni-plugins-dg7bs\" (UID: \"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c\") " pod="openshift-multus/multus-additional-cni-plugins-dg7bs" Apr 16 18:10:15.298130 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295780 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d01511ca-4447-40f9-8518-9d2f62898c7a-cni-binary-copy\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.298130 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295795 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" 
(UniqueName: \"kubernetes.io/configmap/05edfe7d-7845-4f05-a320-9d359462ba01-konnectivity-ca\") pod \"konnectivity-agent-m6xb7\" (UID: \"05edfe7d-7845-4f05-a320-9d359462ba01\") " pod="kube-system/konnectivity-agent-m6xb7" Apr 16 18:10:15.298130 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295798 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-host-run-multus-certs\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.298130 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295809 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-etc-sysctl-d\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.298130 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295846 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-host-run-k8s-cni-cncf-io\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.298130 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295851 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-run-systemd\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.298130 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295874 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/b00af592-6ad1-4cc6-8dc6-0b46ced5a45c-cni-binary-copy\") pod \"multus-additional-cni-plugins-dg7bs\" (UID: \"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c\") " pod="openshift-multus/multus-additional-cni-plugins-dg7bs" Apr 16 18:10:15.298130 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295897 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-etc-modprobe-d\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.298130 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295896 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b00af592-6ad1-4cc6-8dc6-0b46ced5a45c-cnibin\") pod \"multus-additional-cni-plugins-dg7bs\" (UID: \"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c\") " pod="openshift-multus/multus-additional-cni-plugins-dg7bs" Apr 16 18:10:15.298130 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.295954 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-etc-systemd\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.298130 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296047 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-lib-modules\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.298130 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296083 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" 
(UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-host-run-k8s-cni-cncf-io\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.298130 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296109 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9dccd9fe-191d-4ac5-ab44-7a88f2d50784-registration-dir\") pod \"aws-ebs-csi-driver-node-vltcp\" (UID: \"9dccd9fe-191d-4ac5-ab44-7a88f2d50784\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp" Apr 16 18:10:15.298130 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296140 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-etc-modprobe-d\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.298960 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296146 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.298960 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296181 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-system-cni-dir\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.298960 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296207 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-etc-kubernetes\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.298960 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296212 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.298960 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296232 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9dccd9fe-191d-4ac5-ab44-7a88f2d50784-socket-dir\") pod \"aws-ebs-csi-driver-node-vltcp\" (UID: \"9dccd9fe-191d-4ac5-ab44-7a88f2d50784\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp" Apr 16 18:10:15.298960 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296259 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9dccd9fe-191d-4ac5-ab44-7a88f2d50784-sys-fs\") pod \"aws-ebs-csi-driver-node-vltcp\" (UID: \"9dccd9fe-191d-4ac5-ab44-7a88f2d50784\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp" Apr 16 18:10:15.298960 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296266 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-system-cni-dir\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " 
pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.298960 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296283 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5b7s\" (UniqueName: \"kubernetes.io/projected/9dccd9fe-191d-4ac5-ab44-7a88f2d50784-kube-api-access-f5b7s\") pod \"aws-ebs-csi-driver-node-vltcp\" (UID: \"9dccd9fe-191d-4ac5-ab44-7a88f2d50784\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp" Apr 16 18:10:15.298960 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296294 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4454b177-644b-4125-8259-d9aeaf036cf6-ovnkube-script-lib\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.298960 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296312 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9a3d42b4-5542-4575-a00e-81713e0d9ced-iptables-alerter-script\") pod \"iptables-alerter-t6q6l\" (UID: \"9a3d42b4-5542-4575-a00e-81713e0d9ced\") " pod="openshift-network-operator/iptables-alerter-t6q6l" Apr 16 18:10:15.298960 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296317 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-etc-kubernetes\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.298960 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296340 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a22f332b-dce3-4709-8d3e-a6ca1b00bc4a-host\") pod 
\"node-ca-4dfv8\" (UID: \"a22f332b-dce3-4709-8d3e-a6ca1b00bc4a\") " pod="openshift-image-registry/node-ca-4dfv8" Apr 16 18:10:15.298960 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296367 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a22f332b-dce3-4709-8d3e-a6ca1b00bc4a-serviceca\") pod \"node-ca-4dfv8\" (UID: \"a22f332b-dce3-4709-8d3e-a6ca1b00bc4a\") " pod="openshift-image-registry/node-ca-4dfv8" Apr 16 18:10:15.298960 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296390 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-host-kubelet\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.298960 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296392 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/05edfe7d-7845-4f05-a320-9d359462ba01-konnectivity-ca\") pod \"konnectivity-agent-m6xb7\" (UID: \"05edfe7d-7845-4f05-a320-9d359462ba01\") " pod="kube-system/konnectivity-agent-m6xb7" Apr 16 18:10:15.298960 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296414 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-log-socket\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.298960 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296438 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4454b177-644b-4125-8259-d9aeaf036cf6-ovnkube-config\") pod \"ovnkube-node-mbm6k\" (UID: 
\"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.299815 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296461 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/05edfe7d-7845-4f05-a320-9d359462ba01-agent-certs\") pod \"konnectivity-agent-m6xb7\" (UID: \"05edfe7d-7845-4f05-a320-9d359462ba01\") " pod="kube-system/konnectivity-agent-m6xb7" Apr 16 18:10:15.299815 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296477 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-host-kubelet\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.299815 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296485 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-multus-cni-dir\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.299815 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296438 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a22f332b-dce3-4709-8d3e-a6ca1b00bc4a-host\") pod \"node-ca-4dfv8\" (UID: \"a22f332b-dce3-4709-8d3e-a6ca1b00bc4a\") " pod="openshift-image-registry/node-ca-4dfv8" Apr 16 18:10:15.299815 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296814 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a22f332b-dce3-4709-8d3e-a6ca1b00bc4a-serviceca\") pod \"node-ca-4dfv8\" (UID: \"a22f332b-dce3-4709-8d3e-a6ca1b00bc4a\") " pod="openshift-image-registry/node-ca-4dfv8" Apr 
16 18:10:15.299815 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296908 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9a3d42b4-5542-4575-a00e-81713e0d9ced-iptables-alerter-script\") pod \"iptables-alerter-t6q6l\" (UID: \"9a3d42b4-5542-4575-a00e-81713e0d9ced\") " pod="openshift-network-operator/iptables-alerter-t6q6l" Apr 16 18:10:15.299815 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296936 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4454b177-644b-4125-8259-d9aeaf036cf6-log-socket\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.299815 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296959 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-host-var-lib-kubelet\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.299815 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296983 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4454b177-644b-4125-8259-d9aeaf036cf6-ovnkube-config\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.299815 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.296989 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-run\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.299815 
ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.297013 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-host-var-lib-kubelet\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.299815 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.297016 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-host\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.299815 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.297049 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d01511ca-4447-40f9-8518-9d2f62898c7a-multus-cni-dir\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.299815 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.297067 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a31dbcfe-5679-4c94-9030-4e1442f23cf0-tmp\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.299815 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.297076 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-run\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.299815 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.297093 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-dbus\") pod \"global-pull-secret-syncer-56w6g\" (UID: \"4f7d4bee-e467-46d2-b0cf-bba98e8cd041\") " pod="kube-system/global-pull-secret-syncer-56w6g" Apr 16 18:10:15.299815 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.297094 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a31dbcfe-5679-4c94-9030-4e1442f23cf0-host\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.299815 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.297181 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-dbus\") pod \"global-pull-secret-syncer-56w6g\" (UID: \"4f7d4bee-e467-46d2-b0cf-bba98e8cd041\") " pod="kube-system/global-pull-secret-syncer-56w6g" Apr 16 18:10:15.300754 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.299818 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a31dbcfe-5679-4c94-9030-4e1442f23cf0-tmp\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.302042 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.301930 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a31dbcfe-5679-4c94-9030-4e1442f23cf0-etc-tuned\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.302146 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.302082 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/4454b177-644b-4125-8259-d9aeaf036cf6-ovn-node-metrics-cert\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.302226 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.302197 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/05edfe7d-7845-4f05-a320-9d359462ba01-agent-certs\") pod \"konnectivity-agent-m6xb7\" (UID: \"05edfe7d-7845-4f05-a320-9d359462ba01\") " pod="kube-system/konnectivity-agent-m6xb7" Apr 16 18:10:15.305472 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.305421 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmlgq\" (UniqueName: \"kubernetes.io/projected/9a3d42b4-5542-4575-a00e-81713e0d9ced-kube-api-access-vmlgq\") pod \"iptables-alerter-t6q6l\" (UID: \"9a3d42b4-5542-4575-a00e-81713e0d9ced\") " pod="openshift-network-operator/iptables-alerter-t6q6l" Apr 16 18:10:15.305472 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.305460 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97fnl\" (UniqueName: \"kubernetes.io/projected/b00af592-6ad1-4cc6-8dc6-0b46ced5a45c-kube-api-access-97fnl\") pod \"multus-additional-cni-plugins-dg7bs\" (UID: \"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c\") " pod="openshift-multus/multus-additional-cni-plugins-dg7bs" Apr 16 18:10:15.306441 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.306013 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7xwj\" (UniqueName: \"kubernetes.io/projected/4454b177-644b-4125-8259-d9aeaf036cf6-kube-api-access-t7xwj\") pod \"ovnkube-node-mbm6k\" (UID: \"4454b177-644b-4125-8259-d9aeaf036cf6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.306441 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.306120 2574 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-68mjs\" (UniqueName: \"kubernetes.io/projected/d01511ca-4447-40f9-8518-9d2f62898c7a-kube-api-access-68mjs\") pod \"multus-jp8j9\" (UID: \"d01511ca-4447-40f9-8518-9d2f62898c7a\") " pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.306919 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.306876 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2jgz\" (UniqueName: \"kubernetes.io/projected/a31dbcfe-5679-4c94-9030-4e1442f23cf0-kube-api-access-l2jgz\") pod \"tuned-8tjw6\" (UID: \"a31dbcfe-5679-4c94-9030-4e1442f23cf0\") " pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.308896 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.308832 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnfsq\" (UniqueName: \"kubernetes.io/projected/a22f332b-dce3-4709-8d3e-a6ca1b00bc4a-kube-api-access-pnfsq\") pod \"node-ca-4dfv8\" (UID: \"a22f332b-dce3-4709-8d3e-a6ca1b00bc4a\") " pod="openshift-image-registry/node-ca-4dfv8" Apr 16 18:10:15.338140 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.338040 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-48.ec2.internal" event={"ID":"022134da6e351443de7e25a90a6de7a4","Type":"ContainerStarted","Data":"0895bb4d7666ffae47eb5e84172515ec9da0c9fe25e2d566a8a00e4b08ccc0b4"} Apr 16 18:10:15.339412 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.339387 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-48.ec2.internal" event={"ID":"48a65de737f127de1741ec3d41512a5f","Type":"ContainerStarted","Data":"06e8d626edb234a16f68c267d020ff9e8086de648514e7a429c11821ba77aab4"} Apr 16 18:10:15.397993 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.397921 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/9dccd9fe-191d-4ac5-ab44-7a88f2d50784-socket-dir\") pod \"aws-ebs-csi-driver-node-vltcp\" (UID: \"9dccd9fe-191d-4ac5-ab44-7a88f2d50784\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp" Apr 16 18:10:15.397993 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.397972 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9dccd9fe-191d-4ac5-ab44-7a88f2d50784-sys-fs\") pod \"aws-ebs-csi-driver-node-vltcp\" (UID: \"9dccd9fe-191d-4ac5-ab44-7a88f2d50784\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp" Apr 16 18:10:15.398224 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.398000 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5b7s\" (UniqueName: \"kubernetes.io/projected/9dccd9fe-191d-4ac5-ab44-7a88f2d50784-kube-api-access-f5b7s\") pod \"aws-ebs-csi-driver-node-vltcp\" (UID: \"9dccd9fe-191d-4ac5-ab44-7a88f2d50784\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp" Apr 16 18:10:15.398224 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.398033 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5806a3c4-be78-414a-9251-725dc0b94d51-hosts-file\") pod \"node-resolver-sqdqp\" (UID: \"5806a3c4-be78-414a-9251-725dc0b94d51\") " pod="openshift-dns/node-resolver-sqdqp" Apr 16 18:10:15.398224 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.398073 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9dccd9fe-191d-4ac5-ab44-7a88f2d50784-etc-selinux\") pod \"aws-ebs-csi-driver-node-vltcp\" (UID: \"9dccd9fe-191d-4ac5-ab44-7a88f2d50784\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp" Apr 16 18:10:15.398224 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.398096 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9dccd9fe-191d-4ac5-ab44-7a88f2d50784-device-dir\") pod \"aws-ebs-csi-driver-node-vltcp\" (UID: \"9dccd9fe-191d-4ac5-ab44-7a88f2d50784\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp" Apr 16 18:10:15.398224 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.398109 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9dccd9fe-191d-4ac5-ab44-7a88f2d50784-socket-dir\") pod \"aws-ebs-csi-driver-node-vltcp\" (UID: \"9dccd9fe-191d-4ac5-ab44-7a88f2d50784\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp" Apr 16 18:10:15.398224 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.398126 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sq57m\" (UniqueName: \"kubernetes.io/projected/5806a3c4-be78-414a-9251-725dc0b94d51-kube-api-access-sq57m\") pod \"node-resolver-sqdqp\" (UID: \"5806a3c4-be78-414a-9251-725dc0b94d51\") " pod="openshift-dns/node-resolver-sqdqp" Apr 16 18:10:15.398224 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.398177 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wsv66\" (UniqueName: \"kubernetes.io/projected/95b4efe3-7d49-40fc-a8e3-4381e92ed949-kube-api-access-wsv66\") pod \"network-metrics-daemon-7gnqt\" (UID: \"95b4efe3-7d49-40fc-a8e3-4381e92ed949\") " pod="openshift-multus/network-metrics-daemon-7gnqt" Apr 16 18:10:15.398224 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.398206 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5806a3c4-be78-414a-9251-725dc0b94d51-tmp-dir\") pod \"node-resolver-sqdqp\" (UID: \"5806a3c4-be78-414a-9251-725dc0b94d51\") " pod="openshift-dns/node-resolver-sqdqp" Apr 16 18:10:15.398601 ip-10-0-143-48 
kubenswrapper[2574]: I0416 18:10:15.398241 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84l4f\" (UniqueName: \"kubernetes.io/projected/ae61ed6e-f711-4ff6-a33f-c2ff79830f57-kube-api-access-84l4f\") pod \"network-check-target-9jtfh\" (UID: \"ae61ed6e-f711-4ff6-a33f-c2ff79830f57\") " pod="openshift-network-diagnostics/network-check-target-9jtfh" Apr 16 18:10:15.398601 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.398262 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9dccd9fe-191d-4ac5-ab44-7a88f2d50784-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vltcp\" (UID: \"9dccd9fe-191d-4ac5-ab44-7a88f2d50784\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp" Apr 16 18:10:15.398601 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.398277 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95b4efe3-7d49-40fc-a8e3-4381e92ed949-metrics-certs\") pod \"network-metrics-daemon-7gnqt\" (UID: \"95b4efe3-7d49-40fc-a8e3-4381e92ed949\") " pod="openshift-multus/network-metrics-daemon-7gnqt" Apr 16 18:10:15.398601 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.398294 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9dccd9fe-191d-4ac5-ab44-7a88f2d50784-registration-dir\") pod \"aws-ebs-csi-driver-node-vltcp\" (UID: \"9dccd9fe-191d-4ac5-ab44-7a88f2d50784\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp" Apr 16 18:10:15.398601 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.398401 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9dccd9fe-191d-4ac5-ab44-7a88f2d50784-registration-dir\") pod \"aws-ebs-csi-driver-node-vltcp\" (UID: 
\"9dccd9fe-191d-4ac5-ab44-7a88f2d50784\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp" Apr 16 18:10:15.398601 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.398465 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9dccd9fe-191d-4ac5-ab44-7a88f2d50784-sys-fs\") pod \"aws-ebs-csi-driver-node-vltcp\" (UID: \"9dccd9fe-191d-4ac5-ab44-7a88f2d50784\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp" Apr 16 18:10:15.399270 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.398740 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5806a3c4-be78-414a-9251-725dc0b94d51-hosts-file\") pod \"node-resolver-sqdqp\" (UID: \"5806a3c4-be78-414a-9251-725dc0b94d51\") " pod="openshift-dns/node-resolver-sqdqp" Apr 16 18:10:15.399270 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.398776 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9dccd9fe-191d-4ac5-ab44-7a88f2d50784-etc-selinux\") pod \"aws-ebs-csi-driver-node-vltcp\" (UID: \"9dccd9fe-191d-4ac5-ab44-7a88f2d50784\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp" Apr 16 18:10:15.399270 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.398795 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9dccd9fe-191d-4ac5-ab44-7a88f2d50784-device-dir\") pod \"aws-ebs-csi-driver-node-vltcp\" (UID: \"9dccd9fe-191d-4ac5-ab44-7a88f2d50784\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp" Apr 16 18:10:15.399270 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.398825 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9dccd9fe-191d-4ac5-ab44-7a88f2d50784-kubelet-dir\") pod 
\"aws-ebs-csi-driver-node-vltcp\" (UID: \"9dccd9fe-191d-4ac5-ab44-7a88f2d50784\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp" Apr 16 18:10:15.399270 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:15.398873 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:15.399270 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.398932 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5806a3c4-be78-414a-9251-725dc0b94d51-tmp-dir\") pod \"node-resolver-sqdqp\" (UID: \"5806a3c4-be78-414a-9251-725dc0b94d51\") " pod="openshift-dns/node-resolver-sqdqp" Apr 16 18:10:15.399270 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:15.398940 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95b4efe3-7d49-40fc-a8e3-4381e92ed949-metrics-certs podName:95b4efe3-7d49-40fc-a8e3-4381e92ed949 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:15.898920449 +0000 UTC m=+3.262041728 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/95b4efe3-7d49-40fc-a8e3-4381e92ed949-metrics-certs") pod "network-metrics-daemon-7gnqt" (UID: "95b4efe3-7d49-40fc-a8e3-4381e92ed949") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:15.404370 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:15.404342 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:10:15.404370 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:15.404370 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:10:15.404534 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:15.404386 2574 projected.go:194] Error preparing data for projected volume kube-api-access-84l4f for pod openshift-network-diagnostics/network-check-target-9jtfh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:15.404534 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:15.404441 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ae61ed6e-f711-4ff6-a33f-c2ff79830f57-kube-api-access-84l4f podName:ae61ed6e-f711-4ff6-a33f-c2ff79830f57 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:15.904424608 +0000 UTC m=+3.267545895 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-84l4f" (UniqueName: "kubernetes.io/projected/ae61ed6e-f711-4ff6-a33f-c2ff79830f57-kube-api-access-84l4f") pod "network-check-target-9jtfh" (UID: "ae61ed6e-f711-4ff6-a33f-c2ff79830f57") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:15.408643 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.408430 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq57m\" (UniqueName: \"kubernetes.io/projected/5806a3c4-be78-414a-9251-725dc0b94d51-kube-api-access-sq57m\") pod \"node-resolver-sqdqp\" (UID: \"5806a3c4-be78-414a-9251-725dc0b94d51\") " pod="openshift-dns/node-resolver-sqdqp" Apr 16 18:10:15.409441 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.409401 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5b7s\" (UniqueName: \"kubernetes.io/projected/9dccd9fe-191d-4ac5-ab44-7a88f2d50784-kube-api-access-f5b7s\") pod \"aws-ebs-csi-driver-node-vltcp\" (UID: \"9dccd9fe-191d-4ac5-ab44-7a88f2d50784\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp" Apr 16 18:10:15.409441 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.409422 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsv66\" (UniqueName: \"kubernetes.io/projected/95b4efe3-7d49-40fc-a8e3-4381e92ed949-kube-api-access-wsv66\") pod \"network-metrics-daemon-7gnqt\" (UID: \"95b4efe3-7d49-40fc-a8e3-4381e92ed949\") " pod="openshift-multus/network-metrics-daemon-7gnqt" Apr 16 18:10:15.477869 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.477825 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dg7bs" Apr 16 18:10:15.487897 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.487866 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-t6q6l" Apr 16 18:10:15.498843 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.498554 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4dfv8" Apr 16 18:10:15.504285 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.504211 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-m6xb7" Apr 16 18:10:15.513849 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.513827 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jp8j9" Apr 16 18:10:15.521684 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.521662 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:10:15.526339 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.526321 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:10:15.528937 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.528918 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" Apr 16 18:10:15.536515 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.536494 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sqdqp" Apr 16 18:10:15.542152 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.542125 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp" Apr 16 18:10:15.802048 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.801961 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-original-pull-secret\") pod \"global-pull-secret-syncer-56w6g\" (UID: \"4f7d4bee-e467-46d2-b0cf-bba98e8cd041\") " pod="kube-system/global-pull-secret-syncer-56w6g" Apr 16 18:10:15.802203 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:15.802077 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:15.802203 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:15.802138 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-original-pull-secret podName:4f7d4bee-e467-46d2-b0cf-bba98e8cd041 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:16.802119308 +0000 UTC m=+4.165240588 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-original-pull-secret") pod "global-pull-secret-syncer-56w6g" (UID: "4f7d4bee-e467-46d2-b0cf-bba98e8cd041") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:15.902839 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:15.902805 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95b4efe3-7d49-40fc-a8e3-4381e92ed949-metrics-certs\") pod \"network-metrics-daemon-7gnqt\" (UID: \"95b4efe3-7d49-40fc-a8e3-4381e92ed949\") " pod="openshift-multus/network-metrics-daemon-7gnqt" Apr 16 18:10:15.902993 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:15.902917 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:15.902993 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:15.902971 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95b4efe3-7d49-40fc-a8e3-4381e92ed949-metrics-certs podName:95b4efe3-7d49-40fc-a8e3-4381e92ed949 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:16.902957381 +0000 UTC m=+4.266078657 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/95b4efe3-7d49-40fc-a8e3-4381e92ed949-metrics-certs") pod "network-metrics-daemon-7gnqt" (UID: "95b4efe3-7d49-40fc-a8e3-4381e92ed949") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:16.003144 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:16.003108 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84l4f\" (UniqueName: \"kubernetes.io/projected/ae61ed6e-f711-4ff6-a33f-c2ff79830f57-kube-api-access-84l4f\") pod \"network-check-target-9jtfh\" (UID: \"ae61ed6e-f711-4ff6-a33f-c2ff79830f57\") " pod="openshift-network-diagnostics/network-check-target-9jtfh" Apr 16 18:10:16.003325 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:16.003275 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:10:16.003325 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:16.003300 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:10:16.003325 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:16.003313 2574 projected.go:194] Error preparing data for projected volume kube-api-access-84l4f for pod openshift-network-diagnostics/network-check-target-9jtfh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:16.003465 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:16.003382 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ae61ed6e-f711-4ff6-a33f-c2ff79830f57-kube-api-access-84l4f podName:ae61ed6e-f711-4ff6-a33f-c2ff79830f57 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:10:17.003361585 +0000 UTC m=+4.366482882 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-84l4f" (UniqueName: "kubernetes.io/projected/ae61ed6e-f711-4ff6-a33f-c2ff79830f57-kube-api-access-84l4f") pod "network-check-target-9jtfh" (UID: "ae61ed6e-f711-4ff6-a33f-c2ff79830f57") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:16.105211 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:16.105178 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb00af592_6ad1_4cc6_8dc6_0b46ced5a45c.slice/crio-d27b2dd7d3513e2c0486af0efd117b8fd58296bbe95cd0170d9893f35def6dc0 WatchSource:0}: Error finding container d27b2dd7d3513e2c0486af0efd117b8fd58296bbe95cd0170d9893f35def6dc0: Status 404 returned error can't find the container with id d27b2dd7d3513e2c0486af0efd117b8fd58296bbe95cd0170d9893f35def6dc0 Apr 16 18:10:16.112823 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:16.112799 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda31dbcfe_5679_4c94_9030_4e1442f23cf0.slice/crio-bd3f1b875151dbeb80ae661f5cc99da64654a38e6a797c03c54b2310c47934b4 WatchSource:0}: Error finding container bd3f1b875151dbeb80ae661f5cc99da64654a38e6a797c03c54b2310c47934b4: Status 404 returned error can't find the container with id bd3f1b875151dbeb80ae661f5cc99da64654a38e6a797c03c54b2310c47934b4 Apr 16 18:10:16.132615 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:16.132453 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a3d42b4_5542_4575_a00e_81713e0d9ced.slice/crio-f17ac7781f9f673b93aef780a2fb35496dd0d0d7fe3bde0f34a2e8a3640230fa WatchSource:0}: Error finding container 
f17ac7781f9f673b93aef780a2fb35496dd0d0d7fe3bde0f34a2e8a3640230fa: Status 404 returned error can't find the container with id f17ac7781f9f673b93aef780a2fb35496dd0d0d7fe3bde0f34a2e8a3640230fa Apr 16 18:10:16.133154 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:16.133115 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd01511ca_4447_40f9_8518_9d2f62898c7a.slice/crio-f0f9606f13afd6263f1568884949453cb6127a7fbdf46d68d799451bd913fdfb WatchSource:0}: Error finding container f0f9606f13afd6263f1568884949453cb6127a7fbdf46d68d799451bd913fdfb: Status 404 returned error can't find the container with id f0f9606f13afd6263f1568884949453cb6127a7fbdf46d68d799451bd913fdfb Apr 16 18:10:16.134057 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:16.134031 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda22f332b_dce3_4709_8d3e_a6ca1b00bc4a.slice/crio-67d5f2cc8b0ea4777402151b349c4eecad97b73a07547ebc840015669e4e75a3 WatchSource:0}: Error finding container 67d5f2cc8b0ea4777402151b349c4eecad97b73a07547ebc840015669e4e75a3: Status 404 returned error can't find the container with id 67d5f2cc8b0ea4777402151b349c4eecad97b73a07547ebc840015669e4e75a3 Apr 16 18:10:16.134855 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:16.134776 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05edfe7d_7845_4f05_a320_9d359462ba01.slice/crio-b4b374cf360cc3390761f6f8fe5edb406cc3abc17460b897b35d202fb5a68305 WatchSource:0}: Error finding container b4b374cf360cc3390761f6f8fe5edb406cc3abc17460b897b35d202fb5a68305: Status 404 returned error can't find the container with id b4b374cf360cc3390761f6f8fe5edb406cc3abc17460b897b35d202fb5a68305 Apr 16 18:10:16.135918 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:16.135869 2574 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4454b177_644b_4125_8259_d9aeaf036cf6.slice/crio-f1d3b7a7606116378ad3c739f8bbcac361dfb0639317a9fca73887a0769456e6 WatchSource:0}: Error finding container f1d3b7a7606116378ad3c739f8bbcac361dfb0639317a9fca73887a0769456e6: Status 404 returned error can't find the container with id f1d3b7a7606116378ad3c739f8bbcac361dfb0639317a9fca73887a0769456e6 Apr 16 18:10:16.137041 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:16.136883 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5806a3c4_be78_414a_9251_725dc0b94d51.slice/crio-5943854d4daede0826fa08dc4f220cde763df669e1370ba0e97bf65d8dd38a78 WatchSource:0}: Error finding container 5943854d4daede0826fa08dc4f220cde763df669e1370ba0e97bf65d8dd38a78: Status 404 returned error can't find the container with id 5943854d4daede0826fa08dc4f220cde763df669e1370ba0e97bf65d8dd38a78 Apr 16 18:10:16.137552 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:16.137479 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9dccd9fe_191d_4ac5_ab44_7a88f2d50784.slice/crio-cc32266676d56018c5de92f6048b361348684082642dcd8fc127f7d0bcb62870 WatchSource:0}: Error finding container cc32266676d56018c5de92f6048b361348684082642dcd8fc127f7d0bcb62870: Status 404 returned error can't find the container with id cc32266676d56018c5de92f6048b361348684082642dcd8fc127f7d0bcb62870 Apr 16 18:10:16.239349 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:16.239314 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:05:14 +0000 UTC" deadline="2027-11-28 03:25:24.402339389 +0000 UTC" Apr 16 18:10:16.239349 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:16.239345 2574 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="14169h15m8.16299663s" Apr 16 18:10:16.332940 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:16.332909 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gnqt" Apr 16 18:10:16.333077 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:16.333055 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gnqt" podUID="95b4efe3-7d49-40fc-a8e3-4381e92ed949" Apr 16 18:10:16.342016 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:16.341963 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-48.ec2.internal" event={"ID":"022134da6e351443de7e25a90a6de7a4","Type":"ContainerStarted","Data":"4a8fc8015524527c7cff449347f1a44e0d3e306c75849c760af2212935b38693"} Apr 16 18:10:16.343072 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:16.343044 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp" event={"ID":"9dccd9fe-191d-4ac5-ab44-7a88f2d50784","Type":"ContainerStarted","Data":"cc32266676d56018c5de92f6048b361348684082642dcd8fc127f7d0bcb62870"} Apr 16 18:10:16.344045 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:16.344024 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sqdqp" event={"ID":"5806a3c4-be78-414a-9251-725dc0b94d51","Type":"ContainerStarted","Data":"5943854d4daede0826fa08dc4f220cde763df669e1370ba0e97bf65d8dd38a78"} Apr 16 18:10:16.345049 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:16.345027 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4dfv8" 
event={"ID":"a22f332b-dce3-4709-8d3e-a6ca1b00bc4a","Type":"ContainerStarted","Data":"67d5f2cc8b0ea4777402151b349c4eecad97b73a07547ebc840015669e4e75a3"} Apr 16 18:10:16.346046 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:16.345971 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jp8j9" event={"ID":"d01511ca-4447-40f9-8518-9d2f62898c7a","Type":"ContainerStarted","Data":"f0f9606f13afd6263f1568884949453cb6127a7fbdf46d68d799451bd913fdfb"} Apr 16 18:10:16.346979 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:16.346960 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dg7bs" event={"ID":"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c","Type":"ContainerStarted","Data":"d27b2dd7d3513e2c0486af0efd117b8fd58296bbe95cd0170d9893f35def6dc0"} Apr 16 18:10:16.349281 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:16.349254 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-m6xb7" event={"ID":"05edfe7d-7845-4f05-a320-9d359462ba01","Type":"ContainerStarted","Data":"b4b374cf360cc3390761f6f8fe5edb406cc3abc17460b897b35d202fb5a68305"} Apr 16 18:10:16.350818 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:16.350800 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" event={"ID":"4454b177-644b-4125-8259-d9aeaf036cf6","Type":"ContainerStarted","Data":"f1d3b7a7606116378ad3c739f8bbcac361dfb0639317a9fca73887a0769456e6"} Apr 16 18:10:16.351800 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:16.351758 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-t6q6l" event={"ID":"9a3d42b4-5542-4575-a00e-81713e0d9ced","Type":"ContainerStarted","Data":"f17ac7781f9f673b93aef780a2fb35496dd0d0d7fe3bde0f34a2e8a3640230fa"} Apr 16 18:10:16.352586 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:16.352553 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" event={"ID":"a31dbcfe-5679-4c94-9030-4e1442f23cf0","Type":"ContainerStarted","Data":"bd3f1b875151dbeb80ae661f5cc99da64654a38e6a797c03c54b2310c47934b4"} Apr 16 18:10:16.355402 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:16.355363 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-48.ec2.internal" podStartSLOduration=2.355353383 podStartE2EDuration="2.355353383s" podCreationTimestamp="2026-04-16 18:10:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:10:16.355251156 +0000 UTC m=+3.718372464" watchObservedRunningTime="2026-04-16 18:10:16.355353383 +0000 UTC m=+3.718474681" Apr 16 18:10:16.813686 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:16.813606 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-original-pull-secret\") pod \"global-pull-secret-syncer-56w6g\" (UID: \"4f7d4bee-e467-46d2-b0cf-bba98e8cd041\") " pod="kube-system/global-pull-secret-syncer-56w6g" Apr 16 18:10:16.813969 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:16.813948 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:16.814035 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:16.814021 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-original-pull-secret podName:4f7d4bee-e467-46d2-b0cf-bba98e8cd041 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:18.814002545 +0000 UTC m=+6.177123823 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-original-pull-secret") pod "global-pull-secret-syncer-56w6g" (UID: "4f7d4bee-e467-46d2-b0cf-bba98e8cd041") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:16.914805 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:16.914200 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95b4efe3-7d49-40fc-a8e3-4381e92ed949-metrics-certs\") pod \"network-metrics-daemon-7gnqt\" (UID: \"95b4efe3-7d49-40fc-a8e3-4381e92ed949\") " pod="openshift-multus/network-metrics-daemon-7gnqt" Apr 16 18:10:16.914805 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:16.914386 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:16.914805 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:16.914446 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95b4efe3-7d49-40fc-a8e3-4381e92ed949-metrics-certs podName:95b4efe3-7d49-40fc-a8e3-4381e92ed949 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:18.914427577 +0000 UTC m=+6.277548857 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/95b4efe3-7d49-40fc-a8e3-4381e92ed949-metrics-certs") pod "network-metrics-daemon-7gnqt" (UID: "95b4efe3-7d49-40fc-a8e3-4381e92ed949") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:17.015508 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:17.015470 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84l4f\" (UniqueName: \"kubernetes.io/projected/ae61ed6e-f711-4ff6-a33f-c2ff79830f57-kube-api-access-84l4f\") pod \"network-check-target-9jtfh\" (UID: \"ae61ed6e-f711-4ff6-a33f-c2ff79830f57\") " pod="openshift-network-diagnostics/network-check-target-9jtfh" Apr 16 18:10:17.015682 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:17.015620 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:10:17.015682 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:17.015640 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:10:17.015682 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:17.015652 2574 projected.go:194] Error preparing data for projected volume kube-api-access-84l4f for pod openshift-network-diagnostics/network-check-target-9jtfh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:17.015874 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:17.015733 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ae61ed6e-f711-4ff6-a33f-c2ff79830f57-kube-api-access-84l4f podName:ae61ed6e-f711-4ff6-a33f-c2ff79830f57 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:10:19.015715118 +0000 UTC m=+6.378836414 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-84l4f" (UniqueName: "kubernetes.io/projected/ae61ed6e-f711-4ff6-a33f-c2ff79830f57-kube-api-access-84l4f") pod "network-check-target-9jtfh" (UID: "ae61ed6e-f711-4ff6-a33f-c2ff79830f57") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:17.333396 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:17.333363 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-56w6g" Apr 16 18:10:17.333855 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:17.333494 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-56w6g" podUID="4f7d4bee-e467-46d2-b0cf-bba98e8cd041" Apr 16 18:10:17.336234 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:17.336211 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jtfh" Apr 16 18:10:17.336347 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:17.336317 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9jtfh" podUID="ae61ed6e-f711-4ff6-a33f-c2ff79830f57" Apr 16 18:10:17.368605 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:17.368571 2574 generic.go:358] "Generic (PLEG): container finished" podID="48a65de737f127de1741ec3d41512a5f" containerID="49355ee6f6f48efed2751bd894eb18f86c8c4e067a2b5d22aaf9ef9665cd2c6b" exitCode=0 Apr 16 18:10:17.369208 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:17.369183 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-48.ec2.internal" event={"ID":"48a65de737f127de1741ec3d41512a5f","Type":"ContainerDied","Data":"49355ee6f6f48efed2751bd894eb18f86c8c4e067a2b5d22aaf9ef9665cd2c6b"} Apr 16 18:10:18.333175 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:18.333113 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gnqt" Apr 16 18:10:18.333397 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:18.333252 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7gnqt" podUID="95b4efe3-7d49-40fc-a8e3-4381e92ed949" Apr 16 18:10:18.381300 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:18.381264 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-48.ec2.internal" event={"ID":"48a65de737f127de1741ec3d41512a5f","Type":"ContainerStarted","Data":"b2fecc279c32c8453523634267e58d5eeffadf313008bea9a018c93eafb9d2d8"} Apr 16 18:10:18.831055 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:18.831014 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-original-pull-secret\") pod \"global-pull-secret-syncer-56w6g\" (UID: \"4f7d4bee-e467-46d2-b0cf-bba98e8cd041\") " pod="kube-system/global-pull-secret-syncer-56w6g" Apr 16 18:10:18.831215 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:18.831188 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:18.831289 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:18.831274 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-original-pull-secret podName:4f7d4bee-e467-46d2-b0cf-bba98e8cd041 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:22.831252679 +0000 UTC m=+10.194373983 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-original-pull-secret") pod "global-pull-secret-syncer-56w6g" (UID: "4f7d4bee-e467-46d2-b0cf-bba98e8cd041") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:18.931758 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:18.931658 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95b4efe3-7d49-40fc-a8e3-4381e92ed949-metrics-certs\") pod \"network-metrics-daemon-7gnqt\" (UID: \"95b4efe3-7d49-40fc-a8e3-4381e92ed949\") " pod="openshift-multus/network-metrics-daemon-7gnqt" Apr 16 18:10:18.931913 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:18.931828 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:18.931913 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:18.931899 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95b4efe3-7d49-40fc-a8e3-4381e92ed949-metrics-certs podName:95b4efe3-7d49-40fc-a8e3-4381e92ed949 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:22.931879639 +0000 UTC m=+10.295000920 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/95b4efe3-7d49-40fc-a8e3-4381e92ed949-metrics-certs") pod "network-metrics-daemon-7gnqt" (UID: "95b4efe3-7d49-40fc-a8e3-4381e92ed949") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:19.032964 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:19.032923 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84l4f\" (UniqueName: \"kubernetes.io/projected/ae61ed6e-f711-4ff6-a33f-c2ff79830f57-kube-api-access-84l4f\") pod \"network-check-target-9jtfh\" (UID: \"ae61ed6e-f711-4ff6-a33f-c2ff79830f57\") " pod="openshift-network-diagnostics/network-check-target-9jtfh" Apr 16 18:10:19.033160 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:19.033096 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:10:19.033160 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:19.033115 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:10:19.033160 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:19.033128 2574 projected.go:194] Error preparing data for projected volume kube-api-access-84l4f for pod openshift-network-diagnostics/network-check-target-9jtfh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:19.033323 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:19.033190 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ae61ed6e-f711-4ff6-a33f-c2ff79830f57-kube-api-access-84l4f podName:ae61ed6e-f711-4ff6-a33f-c2ff79830f57 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:10:23.03316885 +0000 UTC m=+10.396290149 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-84l4f" (UniqueName: "kubernetes.io/projected/ae61ed6e-f711-4ff6-a33f-c2ff79830f57-kube-api-access-84l4f") pod "network-check-target-9jtfh" (UID: "ae61ed6e-f711-4ff6-a33f-c2ff79830f57") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:19.333554 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:19.333473 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-56w6g" Apr 16 18:10:19.334111 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:19.333605 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-56w6g" podUID="4f7d4bee-e467-46d2-b0cf-bba98e8cd041" Apr 16 18:10:19.334111 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:19.333685 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jtfh" Apr 16 18:10:19.334111 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:19.333804 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9jtfh" podUID="ae61ed6e-f711-4ff6-a33f-c2ff79830f57" Apr 16 18:10:20.333326 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:20.333292 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gnqt" Apr 16 18:10:20.333532 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:20.333453 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gnqt" podUID="95b4efe3-7d49-40fc-a8e3-4381e92ed949" Apr 16 18:10:21.333638 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:21.333250 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jtfh" Apr 16 18:10:21.333638 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:21.333383 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jtfh" podUID="ae61ed6e-f711-4ff6-a33f-c2ff79830f57" Apr 16 18:10:21.333638 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:21.333504 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-56w6g" Apr 16 18:10:21.333638 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:21.333593 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-56w6g" podUID="4f7d4bee-e467-46d2-b0cf-bba98e8cd041" Apr 16 18:10:22.333263 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:22.333226 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gnqt" Apr 16 18:10:22.333465 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:22.333372 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7gnqt" podUID="95b4efe3-7d49-40fc-a8e3-4381e92ed949" Apr 16 18:10:22.866868 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:22.866829 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-original-pull-secret\") pod \"global-pull-secret-syncer-56w6g\" (UID: \"4f7d4bee-e467-46d2-b0cf-bba98e8cd041\") " pod="kube-system/global-pull-secret-syncer-56w6g" Apr 16 18:10:22.867316 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:22.866988 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:22.867316 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:22.867066 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-original-pull-secret podName:4f7d4bee-e467-46d2-b0cf-bba98e8cd041 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:30.86704414 +0000 UTC m=+18.230165423 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-original-pull-secret") pod "global-pull-secret-syncer-56w6g" (UID: "4f7d4bee-e467-46d2-b0cf-bba98e8cd041") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:22.968240 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:22.968200 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95b4efe3-7d49-40fc-a8e3-4381e92ed949-metrics-certs\") pod \"network-metrics-daemon-7gnqt\" (UID: \"95b4efe3-7d49-40fc-a8e3-4381e92ed949\") " pod="openshift-multus/network-metrics-daemon-7gnqt" Apr 16 18:10:22.968440 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:22.968420 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:22.968509 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:22.968495 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95b4efe3-7d49-40fc-a8e3-4381e92ed949-metrics-certs podName:95b4efe3-7d49-40fc-a8e3-4381e92ed949 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:30.968475598 +0000 UTC m=+18.331596874 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/95b4efe3-7d49-40fc-a8e3-4381e92ed949-metrics-certs") pod "network-metrics-daemon-7gnqt" (UID: "95b4efe3-7d49-40fc-a8e3-4381e92ed949") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:23.069215 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:23.069173 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84l4f\" (UniqueName: \"kubernetes.io/projected/ae61ed6e-f711-4ff6-a33f-c2ff79830f57-kube-api-access-84l4f\") pod \"network-check-target-9jtfh\" (UID: \"ae61ed6e-f711-4ff6-a33f-c2ff79830f57\") " pod="openshift-network-diagnostics/network-check-target-9jtfh" Apr 16 18:10:23.069384 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:23.069356 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:10:23.069384 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:23.069383 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:10:23.069519 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:23.069398 2574 projected.go:194] Error preparing data for projected volume kube-api-access-84l4f for pod openshift-network-diagnostics/network-check-target-9jtfh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:23.069519 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:23.069470 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ae61ed6e-f711-4ff6-a33f-c2ff79830f57-kube-api-access-84l4f podName:ae61ed6e-f711-4ff6-a33f-c2ff79830f57 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:10:31.06945116 +0000 UTC m=+18.432572450 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-84l4f" (UniqueName: "kubernetes.io/projected/ae61ed6e-f711-4ff6-a33f-c2ff79830f57-kube-api-access-84l4f") pod "network-check-target-9jtfh" (UID: "ae61ed6e-f711-4ff6-a33f-c2ff79830f57") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:23.334774 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:23.334220 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-56w6g" Apr 16 18:10:23.334774 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:23.334323 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-56w6g" podUID="4f7d4bee-e467-46d2-b0cf-bba98e8cd041" Apr 16 18:10:23.334774 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:23.334371 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jtfh" Apr 16 18:10:23.334774 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:23.334473 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9jtfh" podUID="ae61ed6e-f711-4ff6-a33f-c2ff79830f57" Apr 16 18:10:24.333409 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:24.333243 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gnqt" Apr 16 18:10:24.333409 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:24.333391 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gnqt" podUID="95b4efe3-7d49-40fc-a8e3-4381e92ed949" Apr 16 18:10:25.333335 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:25.333257 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-56w6g" Apr 16 18:10:25.333335 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:25.333283 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jtfh" Apr 16 18:10:25.333875 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:25.333390 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-56w6g" podUID="4f7d4bee-e467-46d2-b0cf-bba98e8cd041"
Apr 16 18:10:25.333875 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:25.333531 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jtfh" podUID="ae61ed6e-f711-4ff6-a33f-c2ff79830f57"
Apr 16 18:10:26.332495 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:26.332459 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gnqt"
Apr 16 18:10:26.332686 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:26.332592 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gnqt" podUID="95b4efe3-7d49-40fc-a8e3-4381e92ed949"
Apr 16 18:10:27.335324 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:27.335293 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-56w6g"
Apr 16 18:10:27.335714 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:27.335293 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jtfh"
Apr 16 18:10:27.335714 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:27.335475 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jtfh" podUID="ae61ed6e-f711-4ff6-a33f-c2ff79830f57"
Apr 16 18:10:27.335714 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:27.335395 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-56w6g" podUID="4f7d4bee-e467-46d2-b0cf-bba98e8cd041"
Apr 16 18:10:28.332880 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:28.332848 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gnqt"
Apr 16 18:10:28.333059 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:28.332962 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gnqt" podUID="95b4efe3-7d49-40fc-a8e3-4381e92ed949"
Apr 16 18:10:29.335538 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:29.335508 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jtfh"
Apr 16 18:10:29.336002 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:29.335517 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-56w6g"
Apr 16 18:10:29.336002 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:29.335634 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jtfh" podUID="ae61ed6e-f711-4ff6-a33f-c2ff79830f57"
Apr 16 18:10:29.336002 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:29.335729 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-56w6g" podUID="4f7d4bee-e467-46d2-b0cf-bba98e8cd041"
Apr 16 18:10:30.332496 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:30.332460 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gnqt"
Apr 16 18:10:30.332742 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:30.332601 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gnqt" podUID="95b4efe3-7d49-40fc-a8e3-4381e92ed949"
Apr 16 18:10:30.926173 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:30.926124 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-original-pull-secret\") pod \"global-pull-secret-syncer-56w6g\" (UID: \"4f7d4bee-e467-46d2-b0cf-bba98e8cd041\") " pod="kube-system/global-pull-secret-syncer-56w6g"
Apr 16 18:10:30.926559 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:30.926292 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:10:30.926559 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:30.926374 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-original-pull-secret podName:4f7d4bee-e467-46d2-b0cf-bba98e8cd041 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:46.926353583 +0000 UTC m=+34.289474873 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-original-pull-secret") pod "global-pull-secret-syncer-56w6g" (UID: "4f7d4bee-e467-46d2-b0cf-bba98e8cd041") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:10:31.027323 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:31.027280 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95b4efe3-7d49-40fc-a8e3-4381e92ed949-metrics-certs\") pod \"network-metrics-daemon-7gnqt\" (UID: \"95b4efe3-7d49-40fc-a8e3-4381e92ed949\") " pod="openshift-multus/network-metrics-daemon-7gnqt"
Apr 16 18:10:31.027520 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:31.027461 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:10:31.027582 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:31.027559 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95b4efe3-7d49-40fc-a8e3-4381e92ed949-metrics-certs podName:95b4efe3-7d49-40fc-a8e3-4381e92ed949 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:47.027517791 +0000 UTC m=+34.390639072 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/95b4efe3-7d49-40fc-a8e3-4381e92ed949-metrics-certs") pod "network-metrics-daemon-7gnqt" (UID: "95b4efe3-7d49-40fc-a8e3-4381e92ed949") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:10:31.128474 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:31.128443 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84l4f\" (UniqueName: \"kubernetes.io/projected/ae61ed6e-f711-4ff6-a33f-c2ff79830f57-kube-api-access-84l4f\") pod \"network-check-target-9jtfh\" (UID: \"ae61ed6e-f711-4ff6-a33f-c2ff79830f57\") " pod="openshift-network-diagnostics/network-check-target-9jtfh"
Apr 16 18:10:31.128645 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:31.128579 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:10:31.128645 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:31.128607 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:10:31.128645 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:31.128620 2574 projected.go:194] Error preparing data for projected volume kube-api-access-84l4f for pod openshift-network-diagnostics/network-check-target-9jtfh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:10:31.128792 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:31.128677 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ae61ed6e-f711-4ff6-a33f-c2ff79830f57-kube-api-access-84l4f podName:ae61ed6e-f711-4ff6-a33f-c2ff79830f57 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:47.128658801 +0000 UTC m=+34.491780085 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-84l4f" (UniqueName: "kubernetes.io/projected/ae61ed6e-f711-4ff6-a33f-c2ff79830f57-kube-api-access-84l4f") pod "network-check-target-9jtfh" (UID: "ae61ed6e-f711-4ff6-a33f-c2ff79830f57") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:10:31.333323 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:31.333225 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-56w6g"
Apr 16 18:10:31.333323 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:31.333260 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jtfh"
Apr 16 18:10:31.333546 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:31.333365 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-56w6g" podUID="4f7d4bee-e467-46d2-b0cf-bba98e8cd041"
Apr 16 18:10:31.333602 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:31.333557 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jtfh" podUID="ae61ed6e-f711-4ff6-a33f-c2ff79830f57"
Apr 16 18:10:32.333367 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:32.333319 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gnqt"
Apr 16 18:10:32.333835 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:32.333465 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gnqt" podUID="95b4efe3-7d49-40fc-a8e3-4381e92ed949"
Apr 16 18:10:33.334303 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:33.334270 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-56w6g"
Apr 16 18:10:33.334663 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:33.334382 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-56w6g" podUID="4f7d4bee-e467-46d2-b0cf-bba98e8cd041"
Apr 16 18:10:33.334663 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:33.334431 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jtfh"
Apr 16 18:10:33.334663 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:33.334506 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jtfh" podUID="ae61ed6e-f711-4ff6-a33f-c2ff79830f57"
Apr 16 18:10:34.333293 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:34.333066 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gnqt"
Apr 16 18:10:34.333448 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:34.333407 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gnqt" podUID="95b4efe3-7d49-40fc-a8e3-4381e92ed949"
Apr 16 18:10:34.411229 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:34.411120 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-m6xb7" event={"ID":"05edfe7d-7845-4f05-a320-9d359462ba01","Type":"ContainerStarted","Data":"26ee9feb00026a6301b4ec7fda6e76ac4d29afe3cc242a0591d61d1939bb717e"}
Apr 16 18:10:34.413339 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:34.413282 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log"
Apr 16 18:10:34.413598 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:34.413572 2574 generic.go:358] "Generic (PLEG): container finished" podID="4454b177-644b-4125-8259-d9aeaf036cf6" containerID="0b99e889709f313d9cd8886d775ac05c35085a640e0a8c060adc40ad8d04b70b" exitCode=1
Apr 16 18:10:34.413653 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:34.413607 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" event={"ID":"4454b177-644b-4125-8259-d9aeaf036cf6","Type":"ContainerStarted","Data":"195663c2eb3be5e63ac281d90f0f8f917f8b50292d97c1e3ce2cd425078ebb37"}
Apr 16 18:10:34.413653 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:34.413633 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" event={"ID":"4454b177-644b-4125-8259-d9aeaf036cf6","Type":"ContainerStarted","Data":"27ae51a9c0f4bd2760401c389d2830e65ba2320279626156677d33f691b8dc4c"}
Apr 16 18:10:34.413653 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:34.413643 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" event={"ID":"4454b177-644b-4125-8259-d9aeaf036cf6","Type":"ContainerStarted","Data":"0e7bed54381a66573e5fc6902c5dd3b0858bf8b83bc7555fcd542c6301e85bc8"}
Apr 16 18:10:34.413785 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:34.413653 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" event={"ID":"4454b177-644b-4125-8259-d9aeaf036cf6","Type":"ContainerDied","Data":"0b99e889709f313d9cd8886d775ac05c35085a640e0a8c060adc40ad8d04b70b"}
Apr 16 18:10:34.413785 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:34.413669 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" event={"ID":"4454b177-644b-4125-8259-d9aeaf036cf6","Type":"ContainerStarted","Data":"e391f943d63bcf1ffc73671e5266c46740a2f2f1e0d828a4527eacb41fb5757d"}
Apr 16 18:10:34.414858 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:34.414839 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" event={"ID":"a31dbcfe-5679-4c94-9030-4e1442f23cf0","Type":"ContainerStarted","Data":"46a10887cdd89ea3d00c5ee2e5b63550a0037d5ea2d06567c2c9f76807297f4c"}
Apr 16 18:10:34.416058 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:34.416041 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp" event={"ID":"9dccd9fe-191d-4ac5-ab44-7a88f2d50784","Type":"ContainerStarted","Data":"d5cd9a4482b0ffeaf5ade78effa6c8aefdaa5ebd86a5677e4caeb287d08dc2b2"}
Apr 16 18:10:34.417152 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:34.417132 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sqdqp" event={"ID":"5806a3c4-be78-414a-9251-725dc0b94d51","Type":"ContainerStarted","Data":"f5c7041b26becb7d6228afde6638b3a38f1ffe64fb20ab899c0a386eec71ef79"}
Apr 16 18:10:34.418244 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:34.418226 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4dfv8" event={"ID":"a22f332b-dce3-4709-8d3e-a6ca1b00bc4a","Type":"ContainerStarted","Data":"d405934a0f06f525c4d91a0f79a7c9449da843d7b2cd983c6e84d87ed06c8aac"}
Apr 16 18:10:34.419381 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:34.419364 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jp8j9" event={"ID":"d01511ca-4447-40f9-8518-9d2f62898c7a","Type":"ContainerStarted","Data":"5b0cac593f8187f6ce60f3a7c292c7198f97b87d18e2eb21dda7fbfd63d7874a"}
Apr 16 18:10:34.420559 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:34.420540 2574 generic.go:358] "Generic (PLEG): container finished" podID="b00af592-6ad1-4cc6-8dc6-0b46ced5a45c" containerID="2b659ada560c009689b82d3e8a4bb847d8f751b3c098ce8f8e48f9a01bac7757" exitCode=0
Apr 16 18:10:34.420631 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:34.420566 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dg7bs" event={"ID":"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c","Type":"ContainerDied","Data":"2b659ada560c009689b82d3e8a4bb847d8f751b3c098ce8f8e48f9a01bac7757"}
Apr 16 18:10:34.424233 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:34.424197 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-m6xb7" podStartSLOduration=4.174218901 podStartE2EDuration="21.424172279s" podCreationTimestamp="2026-04-16 18:10:13 +0000 UTC" firstStartedPulling="2026-04-16 18:10:16.141150055 +0000 UTC m=+3.504271336" lastFinishedPulling="2026-04-16 18:10:33.391103429 +0000 UTC m=+20.754224714" observedRunningTime="2026-04-16 18:10:34.424149034 +0000 UTC m=+21.787270336" watchObservedRunningTime="2026-04-16 18:10:34.424172279 +0000 UTC m=+21.787293559"
Apr 16 18:10:34.424437 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:34.424414 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-48.ec2.internal" podStartSLOduration=20.424407687 podStartE2EDuration="20.424407687s" podCreationTimestamp="2026-04-16 18:10:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:10:18.395909326 +0000 UTC m=+5.759030626" watchObservedRunningTime="2026-04-16 18:10:34.424407687 +0000 UTC m=+21.787528984"
Apr 16 18:10:34.458968 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:34.458930 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jp8j9" podStartSLOduration=3.871295282 podStartE2EDuration="21.458914366s" podCreationTimestamp="2026-04-16 18:10:13 +0000 UTC" firstStartedPulling="2026-04-16 18:10:16.135587631 +0000 UTC m=+3.498708922" lastFinishedPulling="2026-04-16 18:10:33.723206718 +0000 UTC m=+21.086328006" observedRunningTime="2026-04-16 18:10:34.458735601 +0000 UTC m=+21.821856901" watchObservedRunningTime="2026-04-16 18:10:34.458914366 +0000 UTC m=+21.822035643"
Apr 16 18:10:34.474127 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:34.474093 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-8tjw6" podStartSLOduration=4.214517551 podStartE2EDuration="21.474080746s" podCreationTimestamp="2026-04-16 18:10:13 +0000 UTC" firstStartedPulling="2026-04-16 18:10:16.131368488 +0000 UTC m=+3.494489764" lastFinishedPulling="2026-04-16 18:10:33.39093168 +0000 UTC m=+20.754052959" observedRunningTime="2026-04-16 18:10:34.473877699 +0000 UTC m=+21.836998998" watchObservedRunningTime="2026-04-16 18:10:34.474080746 +0000 UTC m=+21.837202044"
Apr 16 18:10:34.485520 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:34.485484 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4dfv8" podStartSLOduration=4.230977309 podStartE2EDuration="21.485472871s" podCreationTimestamp="2026-04-16 18:10:13 +0000 UTC" firstStartedPulling="2026-04-16 18:10:16.136578345 +0000 UTC m=+3.499699623" lastFinishedPulling="2026-04-16 18:10:33.391073906 +0000 UTC m=+20.754195185" observedRunningTime="2026-04-16 18:10:34.485038695 +0000 UTC m=+21.848160015" watchObservedRunningTime="2026-04-16 18:10:34.485472871 +0000 UTC m=+21.848594180"
Apr 16 18:10:34.501905 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:34.501846 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-sqdqp" podStartSLOduration=4.224960466 podStartE2EDuration="21.501835261s" podCreationTimestamp="2026-04-16 18:10:13 +0000 UTC" firstStartedPulling="2026-04-16 18:10:16.141012634 +0000 UTC m=+3.504133917" lastFinishedPulling="2026-04-16 18:10:33.417887419 +0000 UTC m=+20.781008712" observedRunningTime="2026-04-16 18:10:34.501666339 +0000 UTC m=+21.864787640" watchObservedRunningTime="2026-04-16 18:10:34.501835261 +0000 UTC m=+21.864956559"
Apr 16 18:10:34.596293 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:34.596270 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 18:10:34.654849 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:34.654821 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-m6xb7"
Apr 16 18:10:34.655401 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:34.655385 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-m6xb7"
Apr 16 18:10:35.261616 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:35.261443 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:10:34.596287563Z","UUID":"f0831be1-f883-48f3-a450-5df41dee90ad","Handler":null,"Name":"","Endpoint":""}
Apr 16 18:10:35.263333 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:35.263306 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 18:10:35.263458 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:35.263343 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 18:10:35.333541 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:35.333511 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-56w6g"
Apr 16 18:10:35.333541 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:35.333511 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jtfh"
Apr 16 18:10:35.333777 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:35.333671 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jtfh" podUID="ae61ed6e-f711-4ff6-a33f-c2ff79830f57"
Apr 16 18:10:35.333777 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:35.333690 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-56w6g" podUID="4f7d4bee-e467-46d2-b0cf-bba98e8cd041"
Apr 16 18:10:35.423907 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:35.423870 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp" event={"ID":"9dccd9fe-191d-4ac5-ab44-7a88f2d50784","Type":"ContainerStarted","Data":"7671711cc6da422b4e0459fb3c6c6a23eb64db769ed4dcb2e08d9db5ffdabb1f"}
Apr 16 18:10:35.426869 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:35.426849 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log"
Apr 16 18:10:35.427241 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:35.427207 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" event={"ID":"4454b177-644b-4125-8259-d9aeaf036cf6","Type":"ContainerStarted","Data":"8120e8d6cd66a74f31c3d1b1b470148eec129cefd3adebcdb0aa92899dd24519"}
Apr 16 18:10:35.428543 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:35.428479 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-t6q6l" event={"ID":"9a3d42b4-5542-4575-a00e-81713e0d9ced","Type":"ContainerStarted","Data":"613e0e3561e7ed434bcf76c4c3e2121db28304d658be4db8973f30f10a890cd8"}
Apr 16 18:10:35.429309 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:35.429288 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-m6xb7"
Apr 16 18:10:35.429850 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:35.429835 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-m6xb7"
Apr 16 18:10:35.442194 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:35.442162 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-t6q6l" podStartSLOduration=5.15337213 podStartE2EDuration="22.442149761s" podCreationTimestamp="2026-04-16 18:10:13 +0000 UTC" firstStartedPulling="2026-04-16 18:10:16.13447326 +0000 UTC m=+3.497594548" lastFinishedPulling="2026-04-16 18:10:33.423250901 +0000 UTC m=+20.786372179" observedRunningTime="2026-04-16 18:10:35.441817949 +0000 UTC m=+22.804939249" watchObservedRunningTime="2026-04-16 18:10:35.442149761 +0000 UTC m=+22.805271061"
Apr 16 18:10:36.333376 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:36.333182 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gnqt"
Apr 16 18:10:36.333541 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:36.333456 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gnqt" podUID="95b4efe3-7d49-40fc-a8e3-4381e92ed949"
Apr 16 18:10:36.433162 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:36.433130 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp" event={"ID":"9dccd9fe-191d-4ac5-ab44-7a88f2d50784","Type":"ContainerStarted","Data":"50c08bb0941a7f950e2c51d60a7a33779150eb934d9e8c3fba7eb54ee11a7b4c"}
Apr 16 18:10:36.451891 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:36.451832 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vltcp" podStartSLOduration=4.157069527 podStartE2EDuration="23.451814064s" podCreationTimestamp="2026-04-16 18:10:13 +0000 UTC" firstStartedPulling="2026-04-16 18:10:16.140973063 +0000 UTC m=+3.504094354" lastFinishedPulling="2026-04-16 18:10:35.435717598 +0000 UTC m=+22.798838891" observedRunningTime="2026-04-16 18:10:36.451457521 +0000 UTC m=+23.814578846" watchObservedRunningTime="2026-04-16 18:10:36.451814064 +0000 UTC m=+23.814935365"
Apr 16 18:10:37.333058 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:37.333022 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-56w6g"
Apr 16 18:10:37.333275 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:37.333135 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-56w6g" podUID="4f7d4bee-e467-46d2-b0cf-bba98e8cd041"
Apr 16 18:10:37.333275 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:37.333203 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jtfh"
Apr 16 18:10:37.333402 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:37.333299 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jtfh" podUID="ae61ed6e-f711-4ff6-a33f-c2ff79830f57"
Apr 16 18:10:37.438253 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:37.438224 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log"
Apr 16 18:10:37.438774 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:37.438644 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" event={"ID":"4454b177-644b-4125-8259-d9aeaf036cf6","Type":"ContainerStarted","Data":"58912df1459068a3dee4f561d8d52dcf6e60ae62c805ceab22c4de792ef7324c"}
Apr 16 18:10:38.332899 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:38.332859 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gnqt"
Apr 16 18:10:38.333089 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:38.332994 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gnqt" podUID="95b4efe3-7d49-40fc-a8e3-4381e92ed949"
Apr 16 18:10:39.332871 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:39.332639 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-56w6g"
Apr 16 18:10:39.333685 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:39.332711 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jtfh"
Apr 16 18:10:39.333685 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:39.332973 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-56w6g" podUID="4f7d4bee-e467-46d2-b0cf-bba98e8cd041"
Apr 16 18:10:39.333685 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:39.333004 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jtfh" podUID="ae61ed6e-f711-4ff6-a33f-c2ff79830f57"
Apr 16 18:10:39.443533 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:39.443499 2574 generic.go:358] "Generic (PLEG): container finished" podID="b00af592-6ad1-4cc6-8dc6-0b46ced5a45c" containerID="b5995e59de87f1e6152bb7a9d69d106b244705cb363cfa76079927977cdc5bc9" exitCode=0
Apr 16 18:10:39.443730 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:39.443584 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dg7bs" event={"ID":"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c","Type":"ContainerDied","Data":"b5995e59de87f1e6152bb7a9d69d106b244705cb363cfa76079927977cdc5bc9"}
Apr 16 18:10:39.446845 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:39.446827 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log"
Apr 16 18:10:39.447170 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:39.447149 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" event={"ID":"4454b177-644b-4125-8259-d9aeaf036cf6","Type":"ContainerStarted","Data":"2c9d6eaa2d090a64690a3321e097eb0a2609fa7aa17175a3a12bddb69ea978e7"}
Apr 16 18:10:39.447437 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:39.447421 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k"
Apr 16 18:10:39.447512 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:39.447446 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k"
Apr 16 18:10:39.447512 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:39.447456 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k"
Apr 16 18:10:39.447670 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:39.447648 2574 scope.go:117] "RemoveContainer" containerID="0b99e889709f313d9cd8886d775ac05c35085a640e0a8c060adc40ad8d04b70b"
Apr 16 18:10:39.462632 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:39.462610 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k"
Apr 16 18:10:39.462922 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:39.462908 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k"
Apr 16 18:10:40.333271 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:40.333240 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gnqt"
Apr 16 18:10:40.333678 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:40.333360 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gnqt" podUID="95b4efe3-7d49-40fc-a8e3-4381e92ed949"
Apr 16 18:10:40.452296 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:40.452222 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log"
Apr 16 18:10:40.452595 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:40.452560 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" event={"ID":"4454b177-644b-4125-8259-d9aeaf036cf6","Type":"ContainerStarted","Data":"ef517cb8cca57aa70ea679738ac843b46736d3cfd05dd7c5697e1b6f7072628d"}
Apr 16 18:10:40.454445 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:40.454423 2574 generic.go:358] "Generic (PLEG): container finished" podID="b00af592-6ad1-4cc6-8dc6-0b46ced5a45c" containerID="168f36ace6be4407e61e6d7c7a09d5693c48922db3e571c636773e8f099d706c" exitCode=0
Apr 16 18:10:40.454522 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:40.454461 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dg7bs" event={"ID":"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c","Type":"ContainerDied","Data":"168f36ace6be4407e61e6d7c7a09d5693c48922db3e571c636773e8f099d706c"}
Apr 16 18:10:40.479046 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:40.478994 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" podStartSLOduration=9.89981282 podStartE2EDuration="27.478981283s" podCreationTimestamp="2026-04-16 18:10:13 +0000 UTC" firstStartedPulling="2026-04-16 18:10:16.14113012 +0000 UTC m=+3.504251399" lastFinishedPulling="2026-04-16 18:10:33.720298584 +0000 UTC m=+21.083419862" observedRunningTime="2026-04-16 18:10:40.477530715 +0000 UTC m=+27.840652013" watchObservedRunningTime="2026-04-16 18:10:40.478981283 +0000 UTC m=+27.842102580"
Apr 16 18:10:40.550666 ip-10-0-143-48
kubenswrapper[2574]: I0416 18:10:40.550635 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7gnqt"] Apr 16 18:10:40.550857 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:40.550756 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gnqt" Apr 16 18:10:40.550922 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:40.550875 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gnqt" podUID="95b4efe3-7d49-40fc-a8e3-4381e92ed949" Apr 16 18:10:40.553157 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:40.553132 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-56w6g"] Apr 16 18:10:40.553277 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:40.553226 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-56w6g" Apr 16 18:10:40.553336 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:40.553299 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-56w6g" podUID="4f7d4bee-e467-46d2-b0cf-bba98e8cd041" Apr 16 18:10:40.556329 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:40.556306 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9jtfh"] Apr 16 18:10:40.556428 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:40.556392 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jtfh" Apr 16 18:10:40.556478 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:40.556455 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jtfh" podUID="ae61ed6e-f711-4ff6-a33f-c2ff79830f57" Apr 16 18:10:41.458221 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:41.458140 2574 generic.go:358] "Generic (PLEG): container finished" podID="b00af592-6ad1-4cc6-8dc6-0b46ced5a45c" containerID="dc5ed0ec6f513b0598cadfd8d6ceb7d21876400b66ebb7eed5dc46a04f5ff568" exitCode=0 Apr 16 18:10:41.458546 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:41.458215 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dg7bs" event={"ID":"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c","Type":"ContainerDied","Data":"dc5ed0ec6f513b0598cadfd8d6ceb7d21876400b66ebb7eed5dc46a04f5ff568"} Apr 16 18:10:42.333578 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:42.333464 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gnqt" Apr 16 18:10:42.333578 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:42.333499 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-56w6g" Apr 16 18:10:42.333578 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:42.333470 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jtfh" Apr 16 18:10:42.333869 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:42.333623 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gnqt" podUID="95b4efe3-7d49-40fc-a8e3-4381e92ed949" Apr 16 18:10:42.333869 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:42.333769 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-56w6g" podUID="4f7d4bee-e467-46d2-b0cf-bba98e8cd041" Apr 16 18:10:42.333869 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:42.333835 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jtfh" podUID="ae61ed6e-f711-4ff6-a33f-c2ff79830f57" Apr 16 18:10:44.332826 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:44.332787 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gnqt" Apr 16 18:10:44.333346 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:44.332787 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jtfh" Apr 16 18:10:44.333346 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:44.332923 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gnqt" podUID="95b4efe3-7d49-40fc-a8e3-4381e92ed949" Apr 16 18:10:44.333346 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:44.332787 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-56w6g" Apr 16 18:10:44.333346 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:44.332996 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jtfh" podUID="ae61ed6e-f711-4ff6-a33f-c2ff79830f57" Apr 16 18:10:44.333346 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:44.333078 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-56w6g" podUID="4f7d4bee-e467-46d2-b0cf-bba98e8cd041" Apr 16 18:10:46.332452 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:46.332420 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jtfh" Apr 16 18:10:46.333039 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:46.332537 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9jtfh" podUID="ae61ed6e-f711-4ff6-a33f-c2ff79830f57" Apr 16 18:10:46.333039 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:46.332563 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-56w6g" Apr 16 18:10:46.333039 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:46.332681 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-56w6g" podUID="4f7d4bee-e467-46d2-b0cf-bba98e8cd041" Apr 16 18:10:46.333039 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:46.332758 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gnqt" Apr 16 18:10:46.333039 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:46.332880 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7gnqt" podUID="95b4efe3-7d49-40fc-a8e3-4381e92ed949" Apr 16 18:10:46.913575 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:46.913547 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-48.ec2.internal" event="NodeReady" Apr 16 18:10:46.913794 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:46.913667 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 18:10:46.947545 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:46.947517 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-original-pull-secret\") pod \"global-pull-secret-syncer-56w6g\" (UID: \"4f7d4bee-e467-46d2-b0cf-bba98e8cd041\") " pod="kube-system/global-pull-secret-syncer-56w6g" Apr 16 18:10:46.947668 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:46.947620 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:46.947736 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:46.947671 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-original-pull-secret podName:4f7d4bee-e467-46d2-b0cf-bba98e8cd041 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:18.947653494 +0000 UTC m=+66.310774773 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-original-pull-secret") pod "global-pull-secret-syncer-56w6g" (UID: "4f7d4bee-e467-46d2-b0cf-bba98e8cd041") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:46.953655 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:46.953631 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-69ff8ff76b-sxx75"] Apr 16 18:10:46.970834 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:46.970811 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:10:46.970967 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:46.970602 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-69ff8ff76b-sxx75"] Apr 16 18:10:46.971470 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:46.971452 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-d6hfh"] Apr 16 18:10:46.972980 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:46.972955 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 18:10:46.973093 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:46.972992 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 18:10:46.973093 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:46.973068 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-9692l\"" Apr 16 18:10:46.973209 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:46.973098 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 18:10:46.989784 
ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:46.989764 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-thnb8"] Apr 16 18:10:46.989958 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:46.989939 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-d6hfh" Apr 16 18:10:46.990384 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:46.990363 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 18:10:46.992319 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:46.992275 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 18:10:46.992429 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:46.992323 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 18:10:46.992429 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:46.992412 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-n88vn\"" Apr 16 18:10:47.004719 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.004678 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-d6hfh"] Apr 16 18:10:47.004719 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.004719 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-thnb8"] Apr 16 18:10:47.004890 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.004790 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-thnb8" Apr 16 18:10:47.006919 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.006898 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 18:10:47.007026 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.007010 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 18:10:47.007336 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.007286 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-kd8rq\"" Apr 16 18:10:47.007910 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.007683 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 18:10:47.048431 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.048406 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdv6t\" (UniqueName: \"kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-kube-api-access-bdv6t\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:10:47.048559 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.048444 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-certificates\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:10:47.048559 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.048475 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2f002970-ca6d-4ffd-b054-caec8b6e0479-ca-trust-extracted\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:10:47.048559 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.048532 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f002970-ca6d-4ffd-b054-caec8b6e0479-trusted-ca\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:10:47.048659 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.048589 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95b4efe3-7d49-40fc-a8e3-4381e92ed949-metrics-certs\") pod \"network-metrics-daemon-7gnqt\" (UID: \"95b4efe3-7d49-40fc-a8e3-4381e92ed949\") " pod="openshift-multus/network-metrics-daemon-7gnqt" Apr 16 18:10:47.048692 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:47.048662 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:47.048692 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.048664 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2f002970-ca6d-4ffd-b054-caec8b6e0479-installation-pull-secrets\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:10:47.048773 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.048692 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-tls\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:10:47.048773 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:47.048728 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95b4efe3-7d49-40fc-a8e3-4381e92ed949-metrics-certs podName:95b4efe3-7d49-40fc-a8e3-4381e92ed949 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:19.048714345 +0000 UTC m=+66.411835624 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/95b4efe3-7d49-40fc-a8e3-4381e92ed949-metrics-certs") pod "network-metrics-daemon-7gnqt" (UID: "95b4efe3-7d49-40fc-a8e3-4381e92ed949") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:47.048843 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.048785 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2f002970-ca6d-4ffd-b054-caec8b6e0479-image-registry-private-configuration\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:10:47.048876 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.048837 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-bound-sa-token\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:10:47.149877 ip-10-0-143-48 
kubenswrapper[2574]: I0416 18:10:47.149846 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2f002970-ca6d-4ffd-b054-caec8b6e0479-image-registry-private-configuration\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:10:47.150021 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.149891 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0d86e7f9-1a3a-4778-a936-753a6e1ee886-metrics-tls\") pod \"dns-default-d6hfh\" (UID: \"0d86e7f9-1a3a-4778-a936-753a6e1ee886\") " pod="openshift-dns/dns-default-d6hfh" Apr 16 18:10:47.150021 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.149916 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqqqk\" (UniqueName: \"kubernetes.io/projected/0d86e7f9-1a3a-4778-a936-753a6e1ee886-kube-api-access-jqqqk\") pod \"dns-default-d6hfh\" (UID: \"0d86e7f9-1a3a-4778-a936-753a6e1ee886\") " pod="openshift-dns/dns-default-d6hfh" Apr 16 18:10:47.150021 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.149984 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0d86e7f9-1a3a-4778-a936-753a6e1ee886-tmp-dir\") pod \"dns-default-d6hfh\" (UID: \"0d86e7f9-1a3a-4778-a936-753a6e1ee886\") " pod="openshift-dns/dns-default-d6hfh" Apr 16 18:10:47.150134 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.150024 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92d4b593-ee95-460f-9517-0583cadeaeb1-cert\") pod \"ingress-canary-thnb8\" (UID: \"92d4b593-ee95-460f-9517-0583cadeaeb1\") " 
pod="openshift-ingress-canary/ingress-canary-thnb8" Apr 16 18:10:47.150134 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.150046 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xkmt\" (UniqueName: \"kubernetes.io/projected/92d4b593-ee95-460f-9517-0583cadeaeb1-kube-api-access-6xkmt\") pod \"ingress-canary-thnb8\" (UID: \"92d4b593-ee95-460f-9517-0583cadeaeb1\") " pod="openshift-ingress-canary/ingress-canary-thnb8" Apr 16 18:10:47.150134 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.150064 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-bound-sa-token\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:10:47.150134 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.150104 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bdv6t\" (UniqueName: \"kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-kube-api-access-bdv6t\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:10:47.150308 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.150156 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-certificates\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:10:47.150308 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.150181 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d86e7f9-1a3a-4778-a936-753a6e1ee886-config-volume\") pod \"dns-default-d6hfh\" (UID: \"0d86e7f9-1a3a-4778-a936-753a6e1ee886\") " pod="openshift-dns/dns-default-d6hfh" Apr 16 18:10:47.150308 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.150205 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2f002970-ca6d-4ffd-b054-caec8b6e0479-ca-trust-extracted\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:10:47.150308 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.150229 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84l4f\" (UniqueName: \"kubernetes.io/projected/ae61ed6e-f711-4ff6-a33f-c2ff79830f57-kube-api-access-84l4f\") pod \"network-check-target-9jtfh\" (UID: \"ae61ed6e-f711-4ff6-a33f-c2ff79830f57\") " pod="openshift-network-diagnostics/network-check-target-9jtfh" Apr 16 18:10:47.150308 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.150244 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f002970-ca6d-4ffd-b054-caec8b6e0479-trusted-ca\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:10:47.150308 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.150269 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2f002970-ca6d-4ffd-b054-caec8b6e0479-installation-pull-secrets\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:10:47.150308 
ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.150303 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-tls\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:10:47.150636 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:47.150368 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:10:47.150636 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:47.150390 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:10:47.150636 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:47.150403 2574 projected.go:194] Error preparing data for projected volume kube-api-access-84l4f for pod openshift-network-diagnostics/network-check-target-9jtfh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:47.150636 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:47.150470 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ae61ed6e-f711-4ff6-a33f-c2ff79830f57-kube-api-access-84l4f podName:ae61ed6e-f711-4ff6-a33f-c2ff79830f57 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:19.150451625 +0000 UTC m=+66.513572916 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-84l4f" (UniqueName: "kubernetes.io/projected/ae61ed6e-f711-4ff6-a33f-c2ff79830f57-kube-api-access-84l4f") pod "network-check-target-9jtfh" (UID: "ae61ed6e-f711-4ff6-a33f-c2ff79830f57") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:47.150636 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:47.150485 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:10:47.150636 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:47.150508 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69ff8ff76b-sxx75: secret "image-registry-tls" not found Apr 16 18:10:47.150636 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:47.150564 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-tls podName:2f002970-ca6d-4ffd-b054-caec8b6e0479 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:47.650546619 +0000 UTC m=+35.013667908 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-tls") pod "image-registry-69ff8ff76b-sxx75" (UID: "2f002970-ca6d-4ffd-b054-caec8b6e0479") : secret "image-registry-tls" not found Apr 16 18:10:47.150951 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.150744 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-certificates\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:10:47.151091 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.151070 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2f002970-ca6d-4ffd-b054-caec8b6e0479-ca-trust-extracted\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:10:47.151393 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.151372 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f002970-ca6d-4ffd-b054-caec8b6e0479-trusted-ca\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:10:47.154183 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.154162 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2f002970-ca6d-4ffd-b054-caec8b6e0479-installation-pull-secrets\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 
18:10:47.154239 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.154210 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2f002970-ca6d-4ffd-b054-caec8b6e0479-image-registry-private-configuration\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:10:47.158168 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.158146 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdv6t\" (UniqueName: \"kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-kube-api-access-bdv6t\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:10:47.158838 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.158814 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-bound-sa-token\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:10:47.251577 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.251551 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0d86e7f9-1a3a-4778-a936-753a6e1ee886-tmp-dir\") pod \"dns-default-d6hfh\" (UID: \"0d86e7f9-1a3a-4778-a936-753a6e1ee886\") " pod="openshift-dns/dns-default-d6hfh" Apr 16 18:10:47.251690 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.251588 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92d4b593-ee95-460f-9517-0583cadeaeb1-cert\") pod \"ingress-canary-thnb8\" (UID: 
\"92d4b593-ee95-460f-9517-0583cadeaeb1\") " pod="openshift-ingress-canary/ingress-canary-thnb8" Apr 16 18:10:47.251690 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.251606 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6xkmt\" (UniqueName: \"kubernetes.io/projected/92d4b593-ee95-460f-9517-0583cadeaeb1-kube-api-access-6xkmt\") pod \"ingress-canary-thnb8\" (UID: \"92d4b593-ee95-460f-9517-0583cadeaeb1\") " pod="openshift-ingress-canary/ingress-canary-thnb8" Apr 16 18:10:47.251690 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.251655 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d86e7f9-1a3a-4778-a936-753a6e1ee886-config-volume\") pod \"dns-default-d6hfh\" (UID: \"0d86e7f9-1a3a-4778-a936-753a6e1ee886\") " pod="openshift-dns/dns-default-d6hfh" Apr 16 18:10:47.251873 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.251786 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0d86e7f9-1a3a-4778-a936-753a6e1ee886-metrics-tls\") pod \"dns-default-d6hfh\" (UID: \"0d86e7f9-1a3a-4778-a936-753a6e1ee886\") " pod="openshift-dns/dns-default-d6hfh" Apr 16 18:10:47.251873 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:47.251799 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:10:47.251873 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.251819 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jqqqk\" (UniqueName: \"kubernetes.io/projected/0d86e7f9-1a3a-4778-a936-753a6e1ee886-kube-api-access-jqqqk\") pod \"dns-default-d6hfh\" (UID: \"0d86e7f9-1a3a-4778-a936-753a6e1ee886\") " pod="openshift-dns/dns-default-d6hfh" Apr 16 18:10:47.252016 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:47.251923 2574 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92d4b593-ee95-460f-9517-0583cadeaeb1-cert podName:92d4b593-ee95-460f-9517-0583cadeaeb1 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:47.751857799 +0000 UTC m=+35.114979088 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/92d4b593-ee95-460f-9517-0583cadeaeb1-cert") pod "ingress-canary-thnb8" (UID: "92d4b593-ee95-460f-9517-0583cadeaeb1") : secret "canary-serving-cert" not found Apr 16 18:10:47.252084 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:47.252059 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:10:47.252134 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:47.252118 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d86e7f9-1a3a-4778-a936-753a6e1ee886-metrics-tls podName:0d86e7f9-1a3a-4778-a936-753a6e1ee886 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:47.752101926 +0000 UTC m=+35.115223219 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0d86e7f9-1a3a-4778-a936-753a6e1ee886-metrics-tls") pod "dns-default-d6hfh" (UID: "0d86e7f9-1a3a-4778-a936-753a6e1ee886") : secret "dns-default-metrics-tls" not found Apr 16 18:10:47.252212 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.252189 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d86e7f9-1a3a-4778-a936-753a6e1ee886-config-volume\") pod \"dns-default-d6hfh\" (UID: \"0d86e7f9-1a3a-4778-a936-753a6e1ee886\") " pod="openshift-dns/dns-default-d6hfh" Apr 16 18:10:47.252421 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.252396 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0d86e7f9-1a3a-4778-a936-753a6e1ee886-tmp-dir\") pod \"dns-default-d6hfh\" (UID: \"0d86e7f9-1a3a-4778-a936-753a6e1ee886\") " pod="openshift-dns/dns-default-d6hfh" Apr 16 18:10:47.262548 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.262525 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqqqk\" (UniqueName: \"kubernetes.io/projected/0d86e7f9-1a3a-4778-a936-753a6e1ee886-kube-api-access-jqqqk\") pod \"dns-default-d6hfh\" (UID: \"0d86e7f9-1a3a-4778-a936-753a6e1ee886\") " pod="openshift-dns/dns-default-d6hfh" Apr 16 18:10:47.262726 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.262574 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xkmt\" (UniqueName: \"kubernetes.io/projected/92d4b593-ee95-460f-9517-0583cadeaeb1-kube-api-access-6xkmt\") pod \"ingress-canary-thnb8\" (UID: \"92d4b593-ee95-460f-9517-0583cadeaeb1\") " pod="openshift-ingress-canary/ingress-canary-thnb8" Apr 16 18:10:47.472314 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.472255 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-dg7bs" event={"ID":"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c","Type":"ContainerStarted","Data":"d74b0b0d3abcce0dd8609d7a95399af472a8e354c6fcb28923d549da047cafa3"} Apr 16 18:10:47.654587 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.654551 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-tls\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:10:47.654773 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:47.654707 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:10:47.654773 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:47.654727 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69ff8ff76b-sxx75: secret "image-registry-tls" not found Apr 16 18:10:47.654863 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:47.654775 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-tls podName:2f002970-ca6d-4ffd-b054-caec8b6e0479 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:48.654761609 +0000 UTC m=+36.017882884 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-tls") pod "image-registry-69ff8ff76b-sxx75" (UID: "2f002970-ca6d-4ffd-b054-caec8b6e0479") : secret "image-registry-tls" not found Apr 16 18:10:47.755661 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.755583 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92d4b593-ee95-460f-9517-0583cadeaeb1-cert\") pod \"ingress-canary-thnb8\" (UID: \"92d4b593-ee95-460f-9517-0583cadeaeb1\") " pod="openshift-ingress-canary/ingress-canary-thnb8" Apr 16 18:10:47.755806 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:47.755711 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0d86e7f9-1a3a-4778-a936-753a6e1ee886-metrics-tls\") pod \"dns-default-d6hfh\" (UID: \"0d86e7f9-1a3a-4778-a936-753a6e1ee886\") " pod="openshift-dns/dns-default-d6hfh" Apr 16 18:10:47.755806 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:47.755748 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:10:47.755806 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:47.755790 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:10:47.755900 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:47.755808 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92d4b593-ee95-460f-9517-0583cadeaeb1-cert podName:92d4b593-ee95-460f-9517-0583cadeaeb1 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:48.755792673 +0000 UTC m=+36.118913957 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/92d4b593-ee95-460f-9517-0583cadeaeb1-cert") pod "ingress-canary-thnb8" (UID: "92d4b593-ee95-460f-9517-0583cadeaeb1") : secret "canary-serving-cert" not found Apr 16 18:10:47.755900 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:47.755836 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d86e7f9-1a3a-4778-a936-753a6e1ee886-metrics-tls podName:0d86e7f9-1a3a-4778-a936-753a6e1ee886 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:48.755824266 +0000 UTC m=+36.118945547 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0d86e7f9-1a3a-4778-a936-753a6e1ee886-metrics-tls") pod "dns-default-d6hfh" (UID: "0d86e7f9-1a3a-4778-a936-753a6e1ee886") : secret "dns-default-metrics-tls" not found Apr 16 18:10:48.332611 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:48.332562 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jtfh" Apr 16 18:10:48.332611 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:48.332605 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-56w6g" Apr 16 18:10:48.332851 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:48.332580 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gnqt" Apr 16 18:10:48.335125 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:48.335107 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:10:48.335193 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:48.335106 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:10:48.336132 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:48.336103 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:10:48.336249 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:48.336147 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-rbrbw\"" Apr 16 18:10:48.336249 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:48.336175 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 18:10:48.336249 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:48.336103 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xdm8g\"" Apr 16 18:10:48.476127 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:48.476044 2574 generic.go:358] "Generic (PLEG): container finished" podID="b00af592-6ad1-4cc6-8dc6-0b46ced5a45c" containerID="d74b0b0d3abcce0dd8609d7a95399af472a8e354c6fcb28923d549da047cafa3" exitCode=0 Apr 16 18:10:48.476127 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:48.476097 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dg7bs" event={"ID":"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c","Type":"ContainerDied","Data":"d74b0b0d3abcce0dd8609d7a95399af472a8e354c6fcb28923d549da047cafa3"} Apr 16 
18:10:48.662236 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:48.662215 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-tls\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:10:48.662374 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:48.662361 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:10:48.662422 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:48.662375 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69ff8ff76b-sxx75: secret "image-registry-tls" not found Apr 16 18:10:48.662457 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:48.662425 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-tls podName:2f002970-ca6d-4ffd-b054-caec8b6e0479 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:50.662408154 +0000 UTC m=+38.025529432 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-tls") pod "image-registry-69ff8ff76b-sxx75" (UID: "2f002970-ca6d-4ffd-b054-caec8b6e0479") : secret "image-registry-tls" not found Apr 16 18:10:48.762728 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:48.762635 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0d86e7f9-1a3a-4778-a936-753a6e1ee886-metrics-tls\") pod \"dns-default-d6hfh\" (UID: \"0d86e7f9-1a3a-4778-a936-753a6e1ee886\") " pod="openshift-dns/dns-default-d6hfh" Apr 16 18:10:48.762728 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:48.762687 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92d4b593-ee95-460f-9517-0583cadeaeb1-cert\") pod \"ingress-canary-thnb8\" (UID: \"92d4b593-ee95-460f-9517-0583cadeaeb1\") " pod="openshift-ingress-canary/ingress-canary-thnb8" Apr 16 18:10:48.762895 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:48.762808 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:10:48.762895 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:48.762812 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:10:48.762895 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:48.762860 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92d4b593-ee95-460f-9517-0583cadeaeb1-cert podName:92d4b593-ee95-460f-9517-0583cadeaeb1 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:50.762846918 +0000 UTC m=+38.125968194 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/92d4b593-ee95-460f-9517-0583cadeaeb1-cert") pod "ingress-canary-thnb8" (UID: "92d4b593-ee95-460f-9517-0583cadeaeb1") : secret "canary-serving-cert" not found Apr 16 18:10:48.762895 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:48.762873 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d86e7f9-1a3a-4778-a936-753a6e1ee886-metrics-tls podName:0d86e7f9-1a3a-4778-a936-753a6e1ee886 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:50.762867099 +0000 UTC m=+38.125988375 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0d86e7f9-1a3a-4778-a936-753a6e1ee886-metrics-tls") pod "dns-default-d6hfh" (UID: "0d86e7f9-1a3a-4778-a936-753a6e1ee886") : secret "dns-default-metrics-tls" not found Apr 16 18:10:49.480231 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:49.480201 2574 generic.go:358] "Generic (PLEG): container finished" podID="b00af592-6ad1-4cc6-8dc6-0b46ced5a45c" containerID="4a80c3238175245dbf3390e9856046f0240969af0efc2aeca30bef5d32e8bbf0" exitCode=0 Apr 16 18:10:49.480585 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:49.480255 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dg7bs" event={"ID":"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c","Type":"ContainerDied","Data":"4a80c3238175245dbf3390e9856046f0240969af0efc2aeca30bef5d32e8bbf0"} Apr 16 18:10:50.485365 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:50.485147 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dg7bs" event={"ID":"b00af592-6ad1-4cc6-8dc6-0b46ced5a45c","Type":"ContainerStarted","Data":"1f1836e7cc268b7e28ecb5d9171f15abd5b75a0e1702c618f84b78ddc4127bb9"} Apr 16 18:10:50.510211 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:50.509434 2574 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/multus-additional-cni-plugins-dg7bs" podStartSLOduration=6.395578148 podStartE2EDuration="37.50941639s" podCreationTimestamp="2026-04-16 18:10:13 +0000 UTC" firstStartedPulling="2026-04-16 18:10:16.1079366 +0000 UTC m=+3.471057880" lastFinishedPulling="2026-04-16 18:10:47.221774835 +0000 UTC m=+34.584896122" observedRunningTime="2026-04-16 18:10:50.507549672 +0000 UTC m=+37.870670972" watchObservedRunningTime="2026-04-16 18:10:50.50941639 +0000 UTC m=+37.872537689" Apr 16 18:10:50.677607 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:50.677574 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-tls\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:10:50.677786 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:50.677690 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:10:50.677786 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:50.677718 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69ff8ff76b-sxx75: secret "image-registry-tls" not found Apr 16 18:10:50.677786 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:50.677778 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-tls podName:2f002970-ca6d-4ffd-b054-caec8b6e0479 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:54.677764962 +0000 UTC m=+42.040886238 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-tls") pod "image-registry-69ff8ff76b-sxx75" (UID: "2f002970-ca6d-4ffd-b054-caec8b6e0479") : secret "image-registry-tls" not found Apr 16 18:10:50.777953 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:50.777869 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0d86e7f9-1a3a-4778-a936-753a6e1ee886-metrics-tls\") pod \"dns-default-d6hfh\" (UID: \"0d86e7f9-1a3a-4778-a936-753a6e1ee886\") " pod="openshift-dns/dns-default-d6hfh" Apr 16 18:10:50.777953 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:50.777925 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92d4b593-ee95-460f-9517-0583cadeaeb1-cert\") pod \"ingress-canary-thnb8\" (UID: \"92d4b593-ee95-460f-9517-0583cadeaeb1\") " pod="openshift-ingress-canary/ingress-canary-thnb8" Apr 16 18:10:50.778157 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:50.778016 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:10:50.778157 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:50.778033 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:10:50.778157 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:50.778082 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92d4b593-ee95-460f-9517-0583cadeaeb1-cert podName:92d4b593-ee95-460f-9517-0583cadeaeb1 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:54.778069239 +0000 UTC m=+42.141190514 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/92d4b593-ee95-460f-9517-0583cadeaeb1-cert") pod "ingress-canary-thnb8" (UID: "92d4b593-ee95-460f-9517-0583cadeaeb1") : secret "canary-serving-cert" not found Apr 16 18:10:50.778157 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:50.778093 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d86e7f9-1a3a-4778-a936-753a6e1ee886-metrics-tls podName:0d86e7f9-1a3a-4778-a936-753a6e1ee886 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:54.778088064 +0000 UTC m=+42.141209340 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0d86e7f9-1a3a-4778-a936-753a6e1ee886-metrics-tls") pod "dns-default-d6hfh" (UID: "0d86e7f9-1a3a-4778-a936-753a6e1ee886") : secret "dns-default-metrics-tls" not found Apr 16 18:10:54.706663 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:54.706625 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-tls\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:10:54.707052 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:54.706798 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:10:54.707052 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:54.706819 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69ff8ff76b-sxx75: secret "image-registry-tls" not found Apr 16 18:10:54.707052 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:54.706887 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-tls podName:2f002970-ca6d-4ffd-b054-caec8b6e0479 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:02.706858523 +0000 UTC m=+50.069979799 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-tls") pod "image-registry-69ff8ff76b-sxx75" (UID: "2f002970-ca6d-4ffd-b054-caec8b6e0479") : secret "image-registry-tls" not found Apr 16 18:10:54.807515 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:54.807479 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0d86e7f9-1a3a-4778-a936-753a6e1ee886-metrics-tls\") pod \"dns-default-d6hfh\" (UID: \"0d86e7f9-1a3a-4778-a936-753a6e1ee886\") " pod="openshift-dns/dns-default-d6hfh" Apr 16 18:10:54.807667 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:54.807531 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92d4b593-ee95-460f-9517-0583cadeaeb1-cert\") pod \"ingress-canary-thnb8\" (UID: \"92d4b593-ee95-460f-9517-0583cadeaeb1\") " pod="openshift-ingress-canary/ingress-canary-thnb8" Apr 16 18:10:54.807667 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:54.807623 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:10:54.807667 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:54.807623 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:10:54.807785 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:54.807674 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92d4b593-ee95-460f-9517-0583cadeaeb1-cert podName:92d4b593-ee95-460f-9517-0583cadeaeb1 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:11:02.807661462 +0000 UTC m=+50.170782737 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/92d4b593-ee95-460f-9517-0583cadeaeb1-cert") pod "ingress-canary-thnb8" (UID: "92d4b593-ee95-460f-9517-0583cadeaeb1") : secret "canary-serving-cert" not found Apr 16 18:10:54.807785 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:10:54.807689 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d86e7f9-1a3a-4778-a936-753a6e1ee886-metrics-tls podName:0d86e7f9-1a3a-4778-a936-753a6e1ee886 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:02.80768125 +0000 UTC m=+50.170802525 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0d86e7f9-1a3a-4778-a936-753a6e1ee886-metrics-tls") pod "dns-default-d6hfh" (UID: "0d86e7f9-1a3a-4778-a936-753a6e1ee886") : secret "dns-default-metrics-tls" not found Apr 16 18:10:58.417716 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.417665 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56d647dbc5-k242d"] Apr 16 18:10:58.471376 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.471340 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56d647dbc5-k242d"] Apr 16 18:10:58.471376 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.471377 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56d647dbc5-k242d" Apr 16 18:10:58.474689 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.474661 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-9vrlw\"" Apr 16 18:10:58.474689 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.474684 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 18:10:58.474894 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.474763 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 18:10:58.474894 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.474775 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 18:10:58.475011 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.474996 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 18:10:58.478631 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.478606 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk"] Apr 16 18:10:58.506920 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.506899 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f7d7db7cc-p565m"] Apr 16 18:10:58.507072 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.507056 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" Apr 16 18:10:58.509368 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.509346 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 18:10:58.509490 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.509371 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 18:10:58.509490 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.509371 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 18:10:58.509490 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.509376 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 18:10:58.527822 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.527800 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f7d7db7cc-p565m"] Apr 16 18:10:58.527822 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.527821 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk"] Apr 16 18:10:58.527942 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.527901 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f7d7db7cc-p565m" Apr 16 18:10:58.529814 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.529796 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 18:10:58.637676 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.637645 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/02a0f007-c6a0-4878-9fb4-f22bb0637984-ca\") pod \"cluster-proxy-proxy-agent-6d757d5885-gm9sk\" (UID: \"02a0f007-c6a0-4878-9fb4-f22bb0637984\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" Apr 16 18:10:58.637829 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.637682 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/02a0f007-c6a0-4878-9fb4-f22bb0637984-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6d757d5885-gm9sk\" (UID: \"02a0f007-c6a0-4878-9fb4-f22bb0637984\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" Apr 16 18:10:58.637829 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.637747 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0d1d1ebd-92a6-4b10-acf7-ce30b3e87b6f-tmp\") pod \"klusterlet-addon-workmgr-f7d7db7cc-p565m\" (UID: \"0d1d1ebd-92a6-4b10-acf7-ce30b3e87b6f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f7d7db7cc-p565m" Apr 16 18:10:58.637829 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.637776 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9rl7\" (UniqueName: 
\"kubernetes.io/projected/0d1d1ebd-92a6-4b10-acf7-ce30b3e87b6f-kube-api-access-z9rl7\") pod \"klusterlet-addon-workmgr-f7d7db7cc-p565m\" (UID: \"0d1d1ebd-92a6-4b10-acf7-ce30b3e87b6f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f7d7db7cc-p565m" Apr 16 18:10:58.637829 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.637803 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/0d1d1ebd-92a6-4b10-acf7-ce30b3e87b6f-klusterlet-config\") pod \"klusterlet-addon-workmgr-f7d7db7cc-p565m\" (UID: \"0d1d1ebd-92a6-4b10-acf7-ce30b3e87b6f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f7d7db7cc-p565m" Apr 16 18:10:58.637966 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.637856 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/02a0f007-c6a0-4878-9fb4-f22bb0637984-hub\") pod \"cluster-proxy-proxy-agent-6d757d5885-gm9sk\" (UID: \"02a0f007-c6a0-4878-9fb4-f22bb0637984\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" Apr 16 18:10:58.637966 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.637889 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b44qq\" (UniqueName: \"kubernetes.io/projected/0cffb940-7bd8-435b-a8ed-139e96765cf0-kube-api-access-b44qq\") pod \"managed-serviceaccount-addon-agent-56d647dbc5-k242d\" (UID: \"0cffb940-7bd8-435b-a8ed-139e96765cf0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56d647dbc5-k242d" Apr 16 18:10:58.637966 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.637918 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf64d\" (UniqueName: 
\"kubernetes.io/projected/02a0f007-c6a0-4878-9fb4-f22bb0637984-kube-api-access-mf64d\") pod \"cluster-proxy-proxy-agent-6d757d5885-gm9sk\" (UID: \"02a0f007-c6a0-4878-9fb4-f22bb0637984\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" Apr 16 18:10:58.637966 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.637938 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/02a0f007-c6a0-4878-9fb4-f22bb0637984-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6d757d5885-gm9sk\" (UID: \"02a0f007-c6a0-4878-9fb4-f22bb0637984\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" Apr 16 18:10:58.637966 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.637961 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0cffb940-7bd8-435b-a8ed-139e96765cf0-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-56d647dbc5-k242d\" (UID: \"0cffb940-7bd8-435b-a8ed-139e96765cf0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56d647dbc5-k242d" Apr 16 18:10:58.638113 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.637976 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/02a0f007-c6a0-4878-9fb4-f22bb0637984-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6d757d5885-gm9sk\" (UID: \"02a0f007-c6a0-4878-9fb4-f22bb0637984\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" Apr 16 18:10:58.738540 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.738465 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/02a0f007-c6a0-4878-9fb4-f22bb0637984-ca\") pod 
\"cluster-proxy-proxy-agent-6d757d5885-gm9sk\" (UID: \"02a0f007-c6a0-4878-9fb4-f22bb0637984\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" Apr 16 18:10:58.738661 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.738498 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/02a0f007-c6a0-4878-9fb4-f22bb0637984-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6d757d5885-gm9sk\" (UID: \"02a0f007-c6a0-4878-9fb4-f22bb0637984\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" Apr 16 18:10:58.738661 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.738607 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0d1d1ebd-92a6-4b10-acf7-ce30b3e87b6f-tmp\") pod \"klusterlet-addon-workmgr-f7d7db7cc-p565m\" (UID: \"0d1d1ebd-92a6-4b10-acf7-ce30b3e87b6f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f7d7db7cc-p565m" Apr 16 18:10:58.738661 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.738638 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9rl7\" (UniqueName: \"kubernetes.io/projected/0d1d1ebd-92a6-4b10-acf7-ce30b3e87b6f-kube-api-access-z9rl7\") pod \"klusterlet-addon-workmgr-f7d7db7cc-p565m\" (UID: \"0d1d1ebd-92a6-4b10-acf7-ce30b3e87b6f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f7d7db7cc-p565m" Apr 16 18:10:58.738661 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.738657 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/0d1d1ebd-92a6-4b10-acf7-ce30b3e87b6f-klusterlet-config\") pod \"klusterlet-addon-workmgr-f7d7db7cc-p565m\" (UID: \"0d1d1ebd-92a6-4b10-acf7-ce30b3e87b6f\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f7d7db7cc-p565m" Apr 16 18:10:58.738866 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.738684 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/02a0f007-c6a0-4878-9fb4-f22bb0637984-hub\") pod \"cluster-proxy-proxy-agent-6d757d5885-gm9sk\" (UID: \"02a0f007-c6a0-4878-9fb4-f22bb0637984\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" Apr 16 18:10:58.738866 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.738729 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b44qq\" (UniqueName: \"kubernetes.io/projected/0cffb940-7bd8-435b-a8ed-139e96765cf0-kube-api-access-b44qq\") pod \"managed-serviceaccount-addon-agent-56d647dbc5-k242d\" (UID: \"0cffb940-7bd8-435b-a8ed-139e96765cf0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56d647dbc5-k242d" Apr 16 18:10:58.738866 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.738753 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mf64d\" (UniqueName: \"kubernetes.io/projected/02a0f007-c6a0-4878-9fb4-f22bb0637984-kube-api-access-mf64d\") pod \"cluster-proxy-proxy-agent-6d757d5885-gm9sk\" (UID: \"02a0f007-c6a0-4878-9fb4-f22bb0637984\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" Apr 16 18:10:58.738866 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.738782 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/02a0f007-c6a0-4878-9fb4-f22bb0637984-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6d757d5885-gm9sk\" (UID: \"02a0f007-c6a0-4878-9fb4-f22bb0637984\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" Apr 16 18:10:58.738866 ip-10-0-143-48 
kubenswrapper[2574]: I0416 18:10:58.738822 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0cffb940-7bd8-435b-a8ed-139e96765cf0-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-56d647dbc5-k242d\" (UID: \"0cffb940-7bd8-435b-a8ed-139e96765cf0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56d647dbc5-k242d" Apr 16 18:10:58.738866 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.738847 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/02a0f007-c6a0-4878-9fb4-f22bb0637984-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6d757d5885-gm9sk\" (UID: \"02a0f007-c6a0-4878-9fb4-f22bb0637984\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" Apr 16 18:10:58.739138 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.738967 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0d1d1ebd-92a6-4b10-acf7-ce30b3e87b6f-tmp\") pod \"klusterlet-addon-workmgr-f7d7db7cc-p565m\" (UID: \"0d1d1ebd-92a6-4b10-acf7-ce30b3e87b6f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f7d7db7cc-p565m" Apr 16 18:10:58.739522 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.739501 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/02a0f007-c6a0-4878-9fb4-f22bb0637984-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6d757d5885-gm9sk\" (UID: \"02a0f007-c6a0-4878-9fb4-f22bb0637984\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" Apr 16 18:10:58.742137 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.742113 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/02a0f007-c6a0-4878-9fb4-f22bb0637984-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6d757d5885-gm9sk\" (UID: \"02a0f007-c6a0-4878-9fb4-f22bb0637984\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" Apr 16 18:10:58.742240 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.742163 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/0d1d1ebd-92a6-4b10-acf7-ce30b3e87b6f-klusterlet-config\") pod \"klusterlet-addon-workmgr-f7d7db7cc-p565m\" (UID: \"0d1d1ebd-92a6-4b10-acf7-ce30b3e87b6f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f7d7db7cc-p565m" Apr 16 18:10:58.742474 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.742454 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/02a0f007-c6a0-4878-9fb4-f22bb0637984-ca\") pod \"cluster-proxy-proxy-agent-6d757d5885-gm9sk\" (UID: \"02a0f007-c6a0-4878-9fb4-f22bb0637984\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" Apr 16 18:10:58.742518 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.742467 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/02a0f007-c6a0-4878-9fb4-f22bb0637984-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6d757d5885-gm9sk\" (UID: \"02a0f007-c6a0-4878-9fb4-f22bb0637984\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" Apr 16 18:10:58.742518 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.742475 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0cffb940-7bd8-435b-a8ed-139e96765cf0-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-56d647dbc5-k242d\" (UID: \"0cffb940-7bd8-435b-a8ed-139e96765cf0\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56d647dbc5-k242d" Apr 16 18:10:58.742518 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.742506 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/02a0f007-c6a0-4878-9fb4-f22bb0637984-hub\") pod \"cluster-proxy-proxy-agent-6d757d5885-gm9sk\" (UID: \"02a0f007-c6a0-4878-9fb4-f22bb0637984\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" Apr 16 18:10:58.747173 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.747152 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b44qq\" (UniqueName: \"kubernetes.io/projected/0cffb940-7bd8-435b-a8ed-139e96765cf0-kube-api-access-b44qq\") pod \"managed-serviceaccount-addon-agent-56d647dbc5-k242d\" (UID: \"0cffb940-7bd8-435b-a8ed-139e96765cf0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56d647dbc5-k242d" Apr 16 18:10:58.747468 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.747446 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf64d\" (UniqueName: \"kubernetes.io/projected/02a0f007-c6a0-4878-9fb4-f22bb0637984-kube-api-access-mf64d\") pod \"cluster-proxy-proxy-agent-6d757d5885-gm9sk\" (UID: \"02a0f007-c6a0-4878-9fb4-f22bb0637984\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" Apr 16 18:10:58.747867 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.747850 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9rl7\" (UniqueName: \"kubernetes.io/projected/0d1d1ebd-92a6-4b10-acf7-ce30b3e87b6f-kube-api-access-z9rl7\") pod \"klusterlet-addon-workmgr-f7d7db7cc-p565m\" (UID: \"0d1d1ebd-92a6-4b10-acf7-ce30b3e87b6f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f7d7db7cc-p565m" Apr 16 18:10:58.788760 ip-10-0-143-48 kubenswrapper[2574]: 
I0416 18:10:58.788710 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56d647dbc5-k242d" Apr 16 18:10:58.815490 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.815459 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" Apr 16 18:10:58.853893 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.853842 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f7d7db7cc-p565m" Apr 16 18:10:58.984126 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.984093 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56d647dbc5-k242d"] Apr 16 18:10:58.986977 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:58.986950 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cffb940_7bd8_435b_a8ed_139e96765cf0.slice/crio-bfe0705ff5bb7ee8578f4157e2882a738fceabd61e41604394a0243f8ad64032 WatchSource:0}: Error finding container bfe0705ff5bb7ee8578f4157e2882a738fceabd61e41604394a0243f8ad64032: Status 404 returned error can't find the container with id bfe0705ff5bb7ee8578f4157e2882a738fceabd61e41604394a0243f8ad64032 Apr 16 18:10:58.999561 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:58.999529 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk"] Apr 16 18:10:59.003634 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:59.003604 2574 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02a0f007_c6a0_4878_9fb4_f22bb0637984.slice/crio-94be200828c0ff7df40f5ad50ee0e659f9f57a548f92948a28cbdc3ff1cd85b9 WatchSource:0}: Error finding container 94be200828c0ff7df40f5ad50ee0e659f9f57a548f92948a28cbdc3ff1cd85b9: Status 404 returned error can't find the container with id 94be200828c0ff7df40f5ad50ee0e659f9f57a548f92948a28cbdc3ff1cd85b9 Apr 16 18:10:59.020962 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:59.020912 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f7d7db7cc-p565m"] Apr 16 18:10:59.032982 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:10:59.032960 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d1d1ebd_92a6_4b10_acf7_ce30b3e87b6f.slice/crio-a88f615705df971b733c61975148a0cba9a6104fdb2ddc7c475447d789dbb019 WatchSource:0}: Error finding container a88f615705df971b733c61975148a0cba9a6104fdb2ddc7c475447d789dbb019: Status 404 returned error can't find the container with id a88f615705df971b733c61975148a0cba9a6104fdb2ddc7c475447d789dbb019 Apr 16 18:10:59.502483 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:59.502437 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56d647dbc5-k242d" event={"ID":"0cffb940-7bd8-435b-a8ed-139e96765cf0","Type":"ContainerStarted","Data":"bfe0705ff5bb7ee8578f4157e2882a738fceabd61e41604394a0243f8ad64032"} Apr 16 18:10:59.503889 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:10:59.503857 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f7d7db7cc-p565m" event={"ID":"0d1d1ebd-92a6-4b10-acf7-ce30b3e87b6f","Type":"ContainerStarted","Data":"a88f615705df971b733c61975148a0cba9a6104fdb2ddc7c475447d789dbb019"} Apr 16 18:10:59.505201 ip-10-0-143-48 
kubenswrapper[2574]: I0416 18:10:59.505176 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" event={"ID":"02a0f007-c6a0-4878-9fb4-f22bb0637984","Type":"ContainerStarted","Data":"94be200828c0ff7df40f5ad50ee0e659f9f57a548f92948a28cbdc3ff1cd85b9"} Apr 16 18:11:02.773600 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:02.773562 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-tls\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:11:02.774084 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:11:02.773749 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:11:02.774084 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:11:02.773774 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69ff8ff76b-sxx75: secret "image-registry-tls" not found Apr 16 18:11:02.774084 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:11:02.773847 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-tls podName:2f002970-ca6d-4ffd-b054-caec8b6e0479 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:18.773824437 +0000 UTC m=+66.136945730 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-tls") pod "image-registry-69ff8ff76b-sxx75" (UID: "2f002970-ca6d-4ffd-b054-caec8b6e0479") : secret "image-registry-tls" not found Apr 16 18:11:02.874805 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:02.874768 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0d86e7f9-1a3a-4778-a936-753a6e1ee886-metrics-tls\") pod \"dns-default-d6hfh\" (UID: \"0d86e7f9-1a3a-4778-a936-753a6e1ee886\") " pod="openshift-dns/dns-default-d6hfh" Apr 16 18:11:02.874949 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:02.874818 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92d4b593-ee95-460f-9517-0583cadeaeb1-cert\") pod \"ingress-canary-thnb8\" (UID: \"92d4b593-ee95-460f-9517-0583cadeaeb1\") " pod="openshift-ingress-canary/ingress-canary-thnb8" Apr 16 18:11:02.874949 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:11:02.874913 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:11:02.874949 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:11:02.874928 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:11:02.875044 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:11:02.874962 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92d4b593-ee95-460f-9517-0583cadeaeb1-cert podName:92d4b593-ee95-460f-9517-0583cadeaeb1 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:18.874949625 +0000 UTC m=+66.238070905 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/92d4b593-ee95-460f-9517-0583cadeaeb1-cert") pod "ingress-canary-thnb8" (UID: "92d4b593-ee95-460f-9517-0583cadeaeb1") : secret "canary-serving-cert" not found Apr 16 18:11:02.875044 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:11:02.874982 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d86e7f9-1a3a-4778-a936-753a6e1ee886-metrics-tls podName:0d86e7f9-1a3a-4778-a936-753a6e1ee886 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:18.874968454 +0000 UTC m=+66.238089730 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0d86e7f9-1a3a-4778-a936-753a6e1ee886-metrics-tls") pod "dns-default-d6hfh" (UID: "0d86e7f9-1a3a-4778-a936-753a6e1ee886") : secret "dns-default-metrics-tls" not found Apr 16 18:11:04.517711 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:04.517623 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56d647dbc5-k242d" event={"ID":"0cffb940-7bd8-435b-a8ed-139e96765cf0","Type":"ContainerStarted","Data":"0fe0b38aba9da3e32cbd4bf26f34c434eba9d1e38e3fcfabe9e8cd149036c4de"} Apr 16 18:11:04.519054 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:04.519025 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f7d7db7cc-p565m" event={"ID":"0d1d1ebd-92a6-4b10-acf7-ce30b3e87b6f","Type":"ContainerStarted","Data":"b9bb4da576f868d9e33dc4c4b2c14ebb5b69bab1a9a7410ba4f7cd362716d1ea"} Apr 16 18:11:04.519235 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:04.519201 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f7d7db7cc-p565m" Apr 16 18:11:04.520457 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:04.520428 2574 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" event={"ID":"02a0f007-c6a0-4878-9fb4-f22bb0637984","Type":"ContainerStarted","Data":"f7d2e6d63f98b937a2d514d38ce6c42ed81a2aa851e2e79a5b35a37e87fd67e3"} Apr 16 18:11:04.520535 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:04.520488 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f7d7db7cc-p565m" Apr 16 18:11:04.534758 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:04.534720 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56d647dbc5-k242d" podStartSLOduration=1.222777886 podStartE2EDuration="6.534688181s" podCreationTimestamp="2026-04-16 18:10:58 +0000 UTC" firstStartedPulling="2026-04-16 18:10:58.988985788 +0000 UTC m=+46.352107064" lastFinishedPulling="2026-04-16 18:11:04.300896073 +0000 UTC m=+51.664017359" observedRunningTime="2026-04-16 18:11:04.533747769 +0000 UTC m=+51.896869085" watchObservedRunningTime="2026-04-16 18:11:04.534688181 +0000 UTC m=+51.897809522" Apr 16 18:11:04.553334 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:04.553294 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f7d7db7cc-p565m" podStartSLOduration=1.271796001 podStartE2EDuration="6.553281722s" podCreationTimestamp="2026-04-16 18:10:58 +0000 UTC" firstStartedPulling="2026-04-16 18:10:59.034548255 +0000 UTC m=+46.397669532" lastFinishedPulling="2026-04-16 18:11:04.316033974 +0000 UTC m=+51.679155253" observedRunningTime="2026-04-16 18:11:04.552716833 +0000 UTC m=+51.915838145" watchObservedRunningTime="2026-04-16 18:11:04.553281722 +0000 UTC m=+51.916403020" Apr 16 18:11:06.526452 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:06.526426 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" event={"ID":"02a0f007-c6a0-4878-9fb4-f22bb0637984","Type":"ContainerStarted","Data":"a71d94d79179ed600fe71d7449cdc91f20743eb3ddcd4f6244422f504422dd4c"} Apr 16 18:11:07.530277 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:07.530234 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" event={"ID":"02a0f007-c6a0-4878-9fb4-f22bb0637984","Type":"ContainerStarted","Data":"0ff8defbcd2528404f946be7716877e74aa318bd4632dd79f4a5fa2da2fe2d80"} Apr 16 18:11:07.549612 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:07.549561 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" podStartSLOduration=2.119372667 podStartE2EDuration="9.549547095s" podCreationTimestamp="2026-04-16 18:10:58 +0000 UTC" firstStartedPulling="2026-04-16 18:10:59.005484356 +0000 UTC m=+46.368605635" lastFinishedPulling="2026-04-16 18:11:06.435658784 +0000 UTC m=+53.798780063" observedRunningTime="2026-04-16 18:11:07.548559445 +0000 UTC m=+54.911685431" watchObservedRunningTime="2026-04-16 18:11:07.549547095 +0000 UTC m=+54.912668392" Apr 16 18:11:11.469226 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:11.469195 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mbm6k" Apr 16 18:11:18.793678 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:18.793642 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-tls\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:11:18.794066 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:11:18.793791 
2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:11:18.794066 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:11:18.793810 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69ff8ff76b-sxx75: secret "image-registry-tls" not found
Apr 16 18:11:18.794066 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:11:18.793869 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-tls podName:2f002970-ca6d-4ffd-b054-caec8b6e0479 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:50.793847709 +0000 UTC m=+98.156968986 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-tls") pod "image-registry-69ff8ff76b-sxx75" (UID: "2f002970-ca6d-4ffd-b054-caec8b6e0479") : secret "image-registry-tls" not found
Apr 16 18:11:18.894123 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:18.894088 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0d86e7f9-1a3a-4778-a936-753a6e1ee886-metrics-tls\") pod \"dns-default-d6hfh\" (UID: \"0d86e7f9-1a3a-4778-a936-753a6e1ee886\") " pod="openshift-dns/dns-default-d6hfh"
Apr 16 18:11:18.894279 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:18.894132 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92d4b593-ee95-460f-9517-0583cadeaeb1-cert\") pod \"ingress-canary-thnb8\" (UID: \"92d4b593-ee95-460f-9517-0583cadeaeb1\") " pod="openshift-ingress-canary/ingress-canary-thnb8"
Apr 16 18:11:18.894279 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:11:18.894228 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:11:18.894279 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:11:18.894229 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:11:18.894389 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:11:18.894286 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92d4b593-ee95-460f-9517-0583cadeaeb1-cert podName:92d4b593-ee95-460f-9517-0583cadeaeb1 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:50.894272211 +0000 UTC m=+98.257393486 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/92d4b593-ee95-460f-9517-0583cadeaeb1-cert") pod "ingress-canary-thnb8" (UID: "92d4b593-ee95-460f-9517-0583cadeaeb1") : secret "canary-serving-cert" not found
Apr 16 18:11:18.894389 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:11:18.894299 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d86e7f9-1a3a-4778-a936-753a6e1ee886-metrics-tls podName:0d86e7f9-1a3a-4778-a936-753a6e1ee886 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:50.894293193 +0000 UTC m=+98.257414469 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0d86e7f9-1a3a-4778-a936-753a6e1ee886-metrics-tls") pod "dns-default-d6hfh" (UID: "0d86e7f9-1a3a-4778-a936-753a6e1ee886") : secret "dns-default-metrics-tls" not found
Apr 16 18:11:18.994641 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:18.994600 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-original-pull-secret\") pod \"global-pull-secret-syncer-56w6g\" (UID: \"4f7d4bee-e467-46d2-b0cf-bba98e8cd041\") " pod="kube-system/global-pull-secret-syncer-56w6g"
Apr 16 18:11:18.996974 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:18.996955 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 18:11:19.007656 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:19.007632 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4f7d4bee-e467-46d2-b0cf-bba98e8cd041-original-pull-secret\") pod \"global-pull-secret-syncer-56w6g\" (UID: \"4f7d4bee-e467-46d2-b0cf-bba98e8cd041\") " pod="kube-system/global-pull-secret-syncer-56w6g"
Apr 16 18:11:19.095359 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:19.095279 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95b4efe3-7d49-40fc-a8e3-4381e92ed949-metrics-certs\") pod \"network-metrics-daemon-7gnqt\" (UID: \"95b4efe3-7d49-40fc-a8e3-4381e92ed949\") " pod="openshift-multus/network-metrics-daemon-7gnqt"
Apr 16 18:11:19.097362 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:19.097344 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 18:11:19.105950 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:11:19.105933 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 18:11:19.106024 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:11:19.105985 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95b4efe3-7d49-40fc-a8e3-4381e92ed949-metrics-certs podName:95b4efe3-7d49-40fc-a8e3-4381e92ed949 nodeName:}" failed. No retries permitted until 2026-04-16 18:12:23.105970175 +0000 UTC m=+130.469091451 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/95b4efe3-7d49-40fc-a8e3-4381e92ed949-metrics-certs") pod "network-metrics-daemon-7gnqt" (UID: "95b4efe3-7d49-40fc-a8e3-4381e92ed949") : secret "metrics-daemon-secret" not found
Apr 16 18:11:19.196513 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:19.196479 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84l4f\" (UniqueName: \"kubernetes.io/projected/ae61ed6e-f711-4ff6-a33f-c2ff79830f57-kube-api-access-84l4f\") pod \"network-check-target-9jtfh\" (UID: \"ae61ed6e-f711-4ff6-a33f-c2ff79830f57\") " pod="openshift-network-diagnostics/network-check-target-9jtfh"
Apr 16 18:11:19.198747 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:19.198726 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 18:11:19.209133 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:19.209114 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 18:11:19.220023 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:19.219997 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84l4f\" (UniqueName: \"kubernetes.io/projected/ae61ed6e-f711-4ff6-a33f-c2ff79830f57-kube-api-access-84l4f\") pod \"network-check-target-9jtfh\" (UID: \"ae61ed6e-f711-4ff6-a33f-c2ff79830f57\") " pod="openshift-network-diagnostics/network-check-target-9jtfh"
Apr 16 18:11:19.244658 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:19.244640 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-rbrbw\""
Apr 16 18:11:19.252540 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:19.252524 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-56w6g"
Apr 16 18:11:19.252805 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:19.252789 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9jtfh"
Apr 16 18:11:19.398910 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:19.398877 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-56w6g"]
Apr 16 18:11:19.402622 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:11:19.402597 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f7d4bee_e467_46d2_b0cf_bba98e8cd041.slice/crio-9bd629fb72b14a3d7a74d311ff380eed0cf812f914d178114ab0395a5459a178 WatchSource:0}: Error finding container 9bd629fb72b14a3d7a74d311ff380eed0cf812f914d178114ab0395a5459a178: Status 404 returned error can't find the container with id 9bd629fb72b14a3d7a74d311ff380eed0cf812f914d178114ab0395a5459a178
Apr 16 18:11:19.412593 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:19.412568 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9jtfh"]
Apr 16 18:11:19.415453 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:11:19.415429 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae61ed6e_f711_4ff6_a33f_c2ff79830f57.slice/crio-bd410564a651a946f2923375c430884cff3cb0ca644b56747163e9cc452f8e6a WatchSource:0}: Error finding container bd410564a651a946f2923375c430884cff3cb0ca644b56747163e9cc452f8e6a: Status 404 returned error can't find the container with id bd410564a651a946f2923375c430884cff3cb0ca644b56747163e9cc452f8e6a
Apr 16 18:11:19.552955 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:19.552919 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9jtfh" event={"ID":"ae61ed6e-f711-4ff6-a33f-c2ff79830f57","Type":"ContainerStarted","Data":"bd410564a651a946f2923375c430884cff3cb0ca644b56747163e9cc452f8e6a"}
Apr 16 18:11:19.553836 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:19.553812 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-56w6g" event={"ID":"4f7d4bee-e467-46d2-b0cf-bba98e8cd041","Type":"ContainerStarted","Data":"9bd629fb72b14a3d7a74d311ff380eed0cf812f914d178114ab0395a5459a178"}
Apr 16 18:11:24.567506 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:24.567471 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9jtfh" event={"ID":"ae61ed6e-f711-4ff6-a33f-c2ff79830f57","Type":"ContainerStarted","Data":"06e0ea37689323c38217be5da03e0fc6e9d96c399024b2939af62552f2fffea3"}
Apr 16 18:11:24.567962 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:24.567545 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-9jtfh"
Apr 16 18:11:24.568657 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:24.568638 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-56w6g" event={"ID":"4f7d4bee-e467-46d2-b0cf-bba98e8cd041","Type":"ContainerStarted","Data":"e36712eb9b852c9917ed843d59b5abebaaac80b1ddd25c02e82ada2f90084a2e"}
Apr 16 18:11:24.584819 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:24.584772 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-9jtfh" podStartSLOduration=67.401590588 podStartE2EDuration="1m11.584758292s" podCreationTimestamp="2026-04-16 18:10:13 +0000 UTC" firstStartedPulling="2026-04-16 18:11:19.417239918 +0000 UTC m=+66.780361196" lastFinishedPulling="2026-04-16 18:11:23.600407621 +0000 UTC m=+70.963528900" observedRunningTime="2026-04-16 18:11:24.584044047 +0000 UTC m=+71.947165344" watchObservedRunningTime="2026-04-16 18:11:24.584758292 +0000 UTC m=+71.947879591"
Apr 16 18:11:24.604582 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:24.604546 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-56w6g" podStartSLOduration=67.404603042 podStartE2EDuration="1m11.604534667s" podCreationTimestamp="2026-04-16 18:10:13 +0000 UTC" firstStartedPulling="2026-04-16 18:11:19.404473374 +0000 UTC m=+66.767594650" lastFinishedPulling="2026-04-16 18:11:23.604404936 +0000 UTC m=+70.967526275" observedRunningTime="2026-04-16 18:11:24.603273164 +0000 UTC m=+71.966394463" watchObservedRunningTime="2026-04-16 18:11:24.604534667 +0000 UTC m=+71.967655965"
Apr 16 18:11:46.323533 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:46.323417 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-hkxvh"]
Apr 16 18:11:46.330253 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:46.330234 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-hkxvh"
Apr 16 18:11:46.332436 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:46.332407 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-cs6mb\""
Apr 16 18:11:46.333628 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:46.333604 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-hkxvh"]
Apr 16 18:11:46.404378 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:46.404341 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2b8z\" (UniqueName: \"kubernetes.io/projected/7c5a1963-90d7-47a3-9562-1945dd539cd4-kube-api-access-g2b8z\") pod \"network-check-source-7b678d77c7-hkxvh\" (UID: \"7c5a1963-90d7-47a3-9562-1945dd539cd4\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-hkxvh"
Apr 16 18:11:46.504808 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:46.504771 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2b8z\" (UniqueName: \"kubernetes.io/projected/7c5a1963-90d7-47a3-9562-1945dd539cd4-kube-api-access-g2b8z\") pod \"network-check-source-7b678d77c7-hkxvh\" (UID: \"7c5a1963-90d7-47a3-9562-1945dd539cd4\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-hkxvh"
Apr 16 18:11:46.512716 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:46.512675 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2b8z\" (UniqueName: \"kubernetes.io/projected/7c5a1963-90d7-47a3-9562-1945dd539cd4-kube-api-access-g2b8z\") pod \"network-check-source-7b678d77c7-hkxvh\" (UID: \"7c5a1963-90d7-47a3-9562-1945dd539cd4\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-hkxvh"
Apr 16 18:11:46.639928 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:46.639843 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-hkxvh"
Apr 16 18:11:46.755444 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:46.755416 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-hkxvh"]
Apr 16 18:11:46.759093 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:11:46.759062 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c5a1963_90d7_47a3_9562_1945dd539cd4.slice/crio-a0c6c2a98b6290a358e8667bf216aded799b6e9ad75fbbbd16aacc4d66b240b5 WatchSource:0}: Error finding container a0c6c2a98b6290a358e8667bf216aded799b6e9ad75fbbbd16aacc4d66b240b5: Status 404 returned error can't find the container with id a0c6c2a98b6290a358e8667bf216aded799b6e9ad75fbbbd16aacc4d66b240b5
Apr 16 18:11:47.582713 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:47.582673 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-sqdqp_5806a3c4-be78-414a-9251-725dc0b94d51/dns-node-resolver/0.log"
Apr 16 18:11:47.621377 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:47.621338 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-hkxvh" event={"ID":"7c5a1963-90d7-47a3-9562-1945dd539cd4","Type":"ContainerStarted","Data":"4f4a379ea2a3977e3e6ab870136ba856b84efb140db76a79ce5a4e59aac1fa19"}
Apr 16 18:11:47.621377 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:47.621375 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-hkxvh" event={"ID":"7c5a1963-90d7-47a3-9562-1945dd539cd4","Type":"ContainerStarted","Data":"a0c6c2a98b6290a358e8667bf216aded799b6e9ad75fbbbd16aacc4d66b240b5"}
Apr 16 18:11:47.634954 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:47.634909 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-hkxvh" podStartSLOduration=1.6348968350000002 podStartE2EDuration="1.634896835s" podCreationTimestamp="2026-04-16 18:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:11:47.63444628 +0000 UTC m=+94.997567582" watchObservedRunningTime="2026-04-16 18:11:47.634896835 +0000 UTC m=+94.998018132"
Apr 16 18:11:48.181625 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:48.181602 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4dfv8_a22f332b-dce3-4709-8d3e-a6ca1b00bc4a/node-ca/0.log"
Apr 16 18:11:50.837611 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:50.837570 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-tls\") pod \"image-registry-69ff8ff76b-sxx75\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75"
Apr 16 18:11:50.838012 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:11:50.837715 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:11:50.838012 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:11:50.837728 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69ff8ff76b-sxx75: secret "image-registry-tls" not found
Apr 16 18:11:50.838012 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:11:50.837779 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-tls podName:2f002970-ca6d-4ffd-b054-caec8b6e0479 nodeName:}" failed. No retries permitted until 2026-04-16 18:12:54.837765594 +0000 UTC m=+162.200886869 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-tls") pod "image-registry-69ff8ff76b-sxx75" (UID: "2f002970-ca6d-4ffd-b054-caec8b6e0479") : secret "image-registry-tls" not found
Apr 16 18:11:50.938312 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:50.938272 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0d86e7f9-1a3a-4778-a936-753a6e1ee886-metrics-tls\") pod \"dns-default-d6hfh\" (UID: \"0d86e7f9-1a3a-4778-a936-753a6e1ee886\") " pod="openshift-dns/dns-default-d6hfh"
Apr 16 18:11:50.938312 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:50.938316 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92d4b593-ee95-460f-9517-0583cadeaeb1-cert\") pod \"ingress-canary-thnb8\" (UID: \"92d4b593-ee95-460f-9517-0583cadeaeb1\") " pod="openshift-ingress-canary/ingress-canary-thnb8"
Apr 16 18:11:50.938489 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:11:50.938406 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:11:50.938489 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:11:50.938420 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:11:50.938489 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:11:50.938455 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92d4b593-ee95-460f-9517-0583cadeaeb1-cert podName:92d4b593-ee95-460f-9517-0583cadeaeb1 nodeName:}" failed. No retries permitted until 2026-04-16 18:12:54.938442708 +0000 UTC m=+162.301563983 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/92d4b593-ee95-460f-9517-0583cadeaeb1-cert") pod "ingress-canary-thnb8" (UID: "92d4b593-ee95-460f-9517-0583cadeaeb1") : secret "canary-serving-cert" not found
Apr 16 18:11:50.938489 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:11:50.938476 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d86e7f9-1a3a-4778-a936-753a6e1ee886-metrics-tls podName:0d86e7f9-1a3a-4778-a936-753a6e1ee886 nodeName:}" failed. No retries permitted until 2026-04-16 18:12:54.938461342 +0000 UTC m=+162.301582618 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0d86e7f9-1a3a-4778-a936-753a6e1ee886-metrics-tls") pod "dns-default-d6hfh" (UID: "0d86e7f9-1a3a-4778-a936-753a6e1ee886") : secret "dns-default-metrics-tls" not found
Apr 16 18:11:55.573542 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:11:55.573510 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-9jtfh"
Apr 16 18:12:06.141658 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:06.141628 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-48x9c"]
Apr 16 18:12:06.144862 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:06.144841 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-48x9c"
Apr 16 18:12:06.147947 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:06.147910 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 18:12:06.148075 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:06.147948 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 18:12:06.148075 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:06.147995 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-7j8vd\""
Apr 16 18:12:06.148075 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:06.147918 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 18:12:06.148075 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:06.147917 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 18:12:06.155186 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:06.155164 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-48x9c"]
Apr 16 18:12:06.259201 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:06.259164 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/15d0a19e-ca6a-4564-bc4c-88b2d00b2e25-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-48x9c\" (UID: \"15d0a19e-ca6a-4564-bc4c-88b2d00b2e25\") " pod="openshift-insights/insights-runtime-extractor-48x9c"
Apr 16 18:12:06.259201 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:06.259202 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/15d0a19e-ca6a-4564-bc4c-88b2d00b2e25-crio-socket\") pod \"insights-runtime-extractor-48x9c\" (UID: \"15d0a19e-ca6a-4564-bc4c-88b2d00b2e25\") " pod="openshift-insights/insights-runtime-extractor-48x9c"
Apr 16 18:12:06.259468 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:06.259228 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/15d0a19e-ca6a-4564-bc4c-88b2d00b2e25-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-48x9c\" (UID: \"15d0a19e-ca6a-4564-bc4c-88b2d00b2e25\") " pod="openshift-insights/insights-runtime-extractor-48x9c"
Apr 16 18:12:06.259468 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:06.259385 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps967\" (UniqueName: \"kubernetes.io/projected/15d0a19e-ca6a-4564-bc4c-88b2d00b2e25-kube-api-access-ps967\") pod \"insights-runtime-extractor-48x9c\" (UID: \"15d0a19e-ca6a-4564-bc4c-88b2d00b2e25\") " pod="openshift-insights/insights-runtime-extractor-48x9c"
Apr 16 18:12:06.259468 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:06.259431 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/15d0a19e-ca6a-4564-bc4c-88b2d00b2e25-data-volume\") pod \"insights-runtime-extractor-48x9c\" (UID: \"15d0a19e-ca6a-4564-bc4c-88b2d00b2e25\") " pod="openshift-insights/insights-runtime-extractor-48x9c"
Apr 16 18:12:06.360619 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:06.360582 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ps967\" (UniqueName: \"kubernetes.io/projected/15d0a19e-ca6a-4564-bc4c-88b2d00b2e25-kube-api-access-ps967\") pod \"insights-runtime-extractor-48x9c\" (UID: \"15d0a19e-ca6a-4564-bc4c-88b2d00b2e25\") " pod="openshift-insights/insights-runtime-extractor-48x9c"
Apr 16 18:12:06.360821 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:06.360628 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/15d0a19e-ca6a-4564-bc4c-88b2d00b2e25-data-volume\") pod \"insights-runtime-extractor-48x9c\" (UID: \"15d0a19e-ca6a-4564-bc4c-88b2d00b2e25\") " pod="openshift-insights/insights-runtime-extractor-48x9c"
Apr 16 18:12:06.360821 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:06.360693 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/15d0a19e-ca6a-4564-bc4c-88b2d00b2e25-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-48x9c\" (UID: \"15d0a19e-ca6a-4564-bc4c-88b2d00b2e25\") " pod="openshift-insights/insights-runtime-extractor-48x9c"
Apr 16 18:12:06.360821 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:06.360737 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/15d0a19e-ca6a-4564-bc4c-88b2d00b2e25-crio-socket\") pod \"insights-runtime-extractor-48x9c\" (UID: \"15d0a19e-ca6a-4564-bc4c-88b2d00b2e25\") " pod="openshift-insights/insights-runtime-extractor-48x9c"
Apr 16 18:12:06.360821 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:06.360753 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/15d0a19e-ca6a-4564-bc4c-88b2d00b2e25-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-48x9c\" (UID: \"15d0a19e-ca6a-4564-bc4c-88b2d00b2e25\") " pod="openshift-insights/insights-runtime-extractor-48x9c"
Apr 16 18:12:06.361009 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:06.360860 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/15d0a19e-ca6a-4564-bc4c-88b2d00b2e25-crio-socket\") pod \"insights-runtime-extractor-48x9c\" (UID: \"15d0a19e-ca6a-4564-bc4c-88b2d00b2e25\") " pod="openshift-insights/insights-runtime-extractor-48x9c"
Apr 16 18:12:06.361047 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:06.361019 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/15d0a19e-ca6a-4564-bc4c-88b2d00b2e25-data-volume\") pod \"insights-runtime-extractor-48x9c\" (UID: \"15d0a19e-ca6a-4564-bc4c-88b2d00b2e25\") " pod="openshift-insights/insights-runtime-extractor-48x9c"
Apr 16 18:12:06.361363 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:06.361346 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/15d0a19e-ca6a-4564-bc4c-88b2d00b2e25-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-48x9c\" (UID: \"15d0a19e-ca6a-4564-bc4c-88b2d00b2e25\") " pod="openshift-insights/insights-runtime-extractor-48x9c"
Apr 16 18:12:06.363012 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:06.362996 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/15d0a19e-ca6a-4564-bc4c-88b2d00b2e25-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-48x9c\" (UID: \"15d0a19e-ca6a-4564-bc4c-88b2d00b2e25\") " pod="openshift-insights/insights-runtime-extractor-48x9c"
Apr 16 18:12:06.378467 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:06.378439 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps967\" (UniqueName: \"kubernetes.io/projected/15d0a19e-ca6a-4564-bc4c-88b2d00b2e25-kube-api-access-ps967\") pod \"insights-runtime-extractor-48x9c\" (UID: \"15d0a19e-ca6a-4564-bc4c-88b2d00b2e25\") " pod="openshift-insights/insights-runtime-extractor-48x9c"
Apr 16 18:12:06.454346 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:06.454258 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-48x9c"
Apr 16 18:12:06.577553 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:06.575243 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-48x9c"]
Apr 16 18:12:06.580455 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:12:06.580430 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15d0a19e_ca6a_4564_bc4c_88b2d00b2e25.slice/crio-a015a241ca5b049c565fd4f0e3fc7c62587103727c169df0b162b5f8cab878d2 WatchSource:0}: Error finding container a015a241ca5b049c565fd4f0e3fc7c62587103727c169df0b162b5f8cab878d2: Status 404 returned error can't find the container with id a015a241ca5b049c565fd4f0e3fc7c62587103727c169df0b162b5f8cab878d2
Apr 16 18:12:06.666202 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:06.666174 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-48x9c" event={"ID":"15d0a19e-ca6a-4564-bc4c-88b2d00b2e25","Type":"ContainerStarted","Data":"7972479b2991c2808e25d457fd2644ad9dfed90e4e4b5af80bc822dc221b79ba"}
Apr 16 18:12:06.666317 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:06.666210 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-48x9c" event={"ID":"15d0a19e-ca6a-4564-bc4c-88b2d00b2e25","Type":"ContainerStarted","Data":"a015a241ca5b049c565fd4f0e3fc7c62587103727c169df0b162b5f8cab878d2"}
Apr 16 18:12:07.670229 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:07.670192 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-48x9c" event={"ID":"15d0a19e-ca6a-4564-bc4c-88b2d00b2e25","Type":"ContainerStarted","Data":"f5a76f8d4d30f01657d4e3a20385297e007a8ed353d289a303b73d946af1960c"}
Apr 16 18:12:08.676212 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:08.676185 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-48x9c" event={"ID":"15d0a19e-ca6a-4564-bc4c-88b2d00b2e25","Type":"ContainerStarted","Data":"baa56d56c4ace56981d627492d923e9c492b63371d4d471ed7e66f9d7fd19f6b"}
Apr 16 18:12:08.699322 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:08.699273 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-48x9c" podStartSLOduration=0.75267267 podStartE2EDuration="2.699258783s" podCreationTimestamp="2026-04-16 18:12:06 +0000 UTC" firstStartedPulling="2026-04-16 18:12:06.658810598 +0000 UTC m=+114.021931874" lastFinishedPulling="2026-04-16 18:12:08.605396709 +0000 UTC m=+115.968517987" observedRunningTime="2026-04-16 18:12:08.699069409 +0000 UTC m=+116.062190707" watchObservedRunningTime="2026-04-16 18:12:08.699258783 +0000 UTC m=+116.062380081"
Apr 16 18:12:13.415161 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.415124 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-74lpz"]
Apr 16 18:12:13.418426 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.418410 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-74lpz"
Apr 16 18:12:13.420273 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.420250 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 18:12:13.420533 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.420517 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 18:12:13.420898 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.420880 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-gwlxm\""
Apr 16 18:12:13.420957 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.420886 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 18:12:13.421065 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.421045 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 18:12:13.421149 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.421078 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 18:12:13.421149 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.421089 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 18:12:13.513457 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.513433 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7c69288-1997-4e93-9019-58347e21e6df-sys\") pod \"node-exporter-74lpz\" (UID: \"c7c69288-1997-4e93-9019-58347e21e6df\") " pod="openshift-monitoring/node-exporter-74lpz"
Apr 16 18:12:13.513599 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.513463 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c7c69288-1997-4e93-9019-58347e21e6df-node-exporter-wtmp\") pod \"node-exporter-74lpz\" (UID: \"c7c69288-1997-4e93-9019-58347e21e6df\") " pod="openshift-monitoring/node-exporter-74lpz"
Apr 16 18:12:13.513599 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.513487 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7c69288-1997-4e93-9019-58347e21e6df-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-74lpz\" (UID: \"c7c69288-1997-4e93-9019-58347e21e6df\") " pod="openshift-monitoring/node-exporter-74lpz"
Apr 16 18:12:13.513599 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.513504 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8scdw\" (UniqueName: \"kubernetes.io/projected/c7c69288-1997-4e93-9019-58347e21e6df-kube-api-access-8scdw\") pod \"node-exporter-74lpz\" (UID: \"c7c69288-1997-4e93-9019-58347e21e6df\") " pod="openshift-monitoring/node-exporter-74lpz"
Apr 16 18:12:13.513599 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.513529 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c7c69288-1997-4e93-9019-58347e21e6df-node-exporter-accelerators-collector-config\") pod \"node-exporter-74lpz\" (UID: \"c7c69288-1997-4e93-9019-58347e21e6df\") " pod="openshift-monitoring/node-exporter-74lpz"
Apr 16 18:12:13.513599 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.513553 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c7c69288-1997-4e93-9019-58347e21e6df-root\") pod \"node-exporter-74lpz\" (UID: \"c7c69288-1997-4e93-9019-58347e21e6df\") " pod="openshift-monitoring/node-exporter-74lpz"
Apr 16 18:12:13.513599 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.513580 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7c69288-1997-4e93-9019-58347e21e6df-metrics-client-ca\") pod \"node-exporter-74lpz\" (UID: \"c7c69288-1997-4e93-9019-58347e21e6df\") " pod="openshift-monitoring/node-exporter-74lpz"
Apr 16 18:12:13.513809 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.513621 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c7c69288-1997-4e93-9019-58347e21e6df-node-exporter-textfile\") pod \"node-exporter-74lpz\" (UID: \"c7c69288-1997-4e93-9019-58347e21e6df\") " pod="openshift-monitoring/node-exporter-74lpz"
Apr 16 18:12:13.513809 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.513651 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c7c69288-1997-4e93-9019-58347e21e6df-node-exporter-tls\") pod \"node-exporter-74lpz\" (UID: \"c7c69288-1997-4e93-9019-58347e21e6df\") " pod="openshift-monitoring/node-exporter-74lpz"
Apr 16 18:12:13.615047 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.615009 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c7c69288-1997-4e93-9019-58347e21e6df-node-exporter-accelerators-collector-config\") pod \"node-exporter-74lpz\" (UID: \"c7c69288-1997-4e93-9019-58347e21e6df\") " pod="openshift-monitoring/node-exporter-74lpz"
Apr 16 18:12:13.615198 ip-10-0-143-48 kubenswrapper[2574]:
I0416 18:12:13.615057 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c7c69288-1997-4e93-9019-58347e21e6df-root\") pod \"node-exporter-74lpz\" (UID: \"c7c69288-1997-4e93-9019-58347e21e6df\") " pod="openshift-monitoring/node-exporter-74lpz" Apr 16 18:12:13.615198 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.615094 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7c69288-1997-4e93-9019-58347e21e6df-metrics-client-ca\") pod \"node-exporter-74lpz\" (UID: \"c7c69288-1997-4e93-9019-58347e21e6df\") " pod="openshift-monitoring/node-exporter-74lpz" Apr 16 18:12:13.615198 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.615117 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c7c69288-1997-4e93-9019-58347e21e6df-node-exporter-textfile\") pod \"node-exporter-74lpz\" (UID: \"c7c69288-1997-4e93-9019-58347e21e6df\") " pod="openshift-monitoring/node-exporter-74lpz" Apr 16 18:12:13.615198 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.615132 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c7c69288-1997-4e93-9019-58347e21e6df-node-exporter-tls\") pod \"node-exporter-74lpz\" (UID: \"c7c69288-1997-4e93-9019-58347e21e6df\") " pod="openshift-monitoring/node-exporter-74lpz" Apr 16 18:12:13.615396 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.615197 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7c69288-1997-4e93-9019-58347e21e6df-sys\") pod \"node-exporter-74lpz\" (UID: \"c7c69288-1997-4e93-9019-58347e21e6df\") " pod="openshift-monitoring/node-exporter-74lpz" Apr 16 18:12:13.615396 ip-10-0-143-48 kubenswrapper[2574]: I0416 
18:12:13.615220 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c7c69288-1997-4e93-9019-58347e21e6df-node-exporter-wtmp\") pod \"node-exporter-74lpz\" (UID: \"c7c69288-1997-4e93-9019-58347e21e6df\") " pod="openshift-monitoring/node-exporter-74lpz" Apr 16 18:12:13.615396 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.615220 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c7c69288-1997-4e93-9019-58347e21e6df-root\") pod \"node-exporter-74lpz\" (UID: \"c7c69288-1997-4e93-9019-58347e21e6df\") " pod="openshift-monitoring/node-exporter-74lpz" Apr 16 18:12:13.615396 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.615260 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7c69288-1997-4e93-9019-58347e21e6df-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-74lpz\" (UID: \"c7c69288-1997-4e93-9019-58347e21e6df\") " pod="openshift-monitoring/node-exporter-74lpz" Apr 16 18:12:13.615396 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:12:13.615287 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 18:12:13.615396 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:12:13.615346 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7c69288-1997-4e93-9019-58347e21e6df-node-exporter-tls podName:c7c69288-1997-4e93-9019-58347e21e6df nodeName:}" failed. No retries permitted until 2026-04-16 18:12:14.115324566 +0000 UTC m=+121.478445842 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/c7c69288-1997-4e93-9019-58347e21e6df-node-exporter-tls") pod "node-exporter-74lpz" (UID: "c7c69288-1997-4e93-9019-58347e21e6df") : secret "node-exporter-tls" not found Apr 16 18:12:13.615396 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.615287 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8scdw\" (UniqueName: \"kubernetes.io/projected/c7c69288-1997-4e93-9019-58347e21e6df-kube-api-access-8scdw\") pod \"node-exporter-74lpz\" (UID: \"c7c69288-1997-4e93-9019-58347e21e6df\") " pod="openshift-monitoring/node-exporter-74lpz" Apr 16 18:12:13.615396 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.615360 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c7c69288-1997-4e93-9019-58347e21e6df-node-exporter-wtmp\") pod \"node-exporter-74lpz\" (UID: \"c7c69288-1997-4e93-9019-58347e21e6df\") " pod="openshift-monitoring/node-exporter-74lpz" Apr 16 18:12:13.615396 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.615282 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7c69288-1997-4e93-9019-58347e21e6df-sys\") pod \"node-exporter-74lpz\" (UID: \"c7c69288-1997-4e93-9019-58347e21e6df\") " pod="openshift-monitoring/node-exporter-74lpz" Apr 16 18:12:13.615805 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.615586 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c7c69288-1997-4e93-9019-58347e21e6df-node-exporter-textfile\") pod \"node-exporter-74lpz\" (UID: \"c7c69288-1997-4e93-9019-58347e21e6df\") " pod="openshift-monitoring/node-exporter-74lpz" Apr 16 18:12:13.615805 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.615680 2574 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c7c69288-1997-4e93-9019-58347e21e6df-node-exporter-accelerators-collector-config\") pod \"node-exporter-74lpz\" (UID: \"c7c69288-1997-4e93-9019-58347e21e6df\") " pod="openshift-monitoring/node-exporter-74lpz" Apr 16 18:12:13.615805 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.615774 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7c69288-1997-4e93-9019-58347e21e6df-metrics-client-ca\") pod \"node-exporter-74lpz\" (UID: \"c7c69288-1997-4e93-9019-58347e21e6df\") " pod="openshift-monitoring/node-exporter-74lpz" Apr 16 18:12:13.617627 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.617599 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7c69288-1997-4e93-9019-58347e21e6df-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-74lpz\" (UID: \"c7c69288-1997-4e93-9019-58347e21e6df\") " pod="openshift-monitoring/node-exporter-74lpz" Apr 16 18:12:13.623391 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:13.623369 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8scdw\" (UniqueName: \"kubernetes.io/projected/c7c69288-1997-4e93-9019-58347e21e6df-kube-api-access-8scdw\") pod \"node-exporter-74lpz\" (UID: \"c7c69288-1997-4e93-9019-58347e21e6df\") " pod="openshift-monitoring/node-exporter-74lpz" Apr 16 18:12:14.119367 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:14.119333 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c7c69288-1997-4e93-9019-58347e21e6df-node-exporter-tls\") pod \"node-exporter-74lpz\" (UID: \"c7c69288-1997-4e93-9019-58347e21e6df\") " pod="openshift-monitoring/node-exporter-74lpz" Apr 16 18:12:14.121503 ip-10-0-143-48 
kubenswrapper[2574]: I0416 18:12:14.121474 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c7c69288-1997-4e93-9019-58347e21e6df-node-exporter-tls\") pod \"node-exporter-74lpz\" (UID: \"c7c69288-1997-4e93-9019-58347e21e6df\") " pod="openshift-monitoring/node-exporter-74lpz" Apr 16 18:12:14.327775 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:14.327746 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-74lpz" Apr 16 18:12:14.336222 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:12:14.336196 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7c69288_1997_4e93_9019_58347e21e6df.slice/crio-74bd8b4c3f45788adda03e12411468267160ff0eedcb384d21780a739486ef0e WatchSource:0}: Error finding container 74bd8b4c3f45788adda03e12411468267160ff0eedcb384d21780a739486ef0e: Status 404 returned error can't find the container with id 74bd8b4c3f45788adda03e12411468267160ff0eedcb384d21780a739486ef0e Apr 16 18:12:14.692161 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:14.692124 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-74lpz" event={"ID":"c7c69288-1997-4e93-9019-58347e21e6df","Type":"ContainerStarted","Data":"74bd8b4c3f45788adda03e12411468267160ff0eedcb384d21780a739486ef0e"} Apr 16 18:12:15.695777 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:15.695741 2574 generic.go:358] "Generic (PLEG): container finished" podID="c7c69288-1997-4e93-9019-58347e21e6df" containerID="22bb6911a509a78eacf231338ecd00ca3604c6ba3bb930ccbdc22ad7d9e6adf6" exitCode=0 Apr 16 18:12:15.696153 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:15.695784 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-74lpz" 
event={"ID":"c7c69288-1997-4e93-9019-58347e21e6df","Type":"ContainerDied","Data":"22bb6911a509a78eacf231338ecd00ca3604c6ba3bb930ccbdc22ad7d9e6adf6"} Apr 16 18:12:16.699366 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:16.699331 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-74lpz" event={"ID":"c7c69288-1997-4e93-9019-58347e21e6df","Type":"ContainerStarted","Data":"be833514a04f7baa0fb6f6aad3e9b48c1a5f590283f35bf8258b893b8fa3e2b8"} Apr 16 18:12:16.699366 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:16.699366 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-74lpz" event={"ID":"c7c69288-1997-4e93-9019-58347e21e6df","Type":"ContainerStarted","Data":"6cf1723135bbc3ec15dfa077b597092e788d8b55beb28f660cbc95694d61a2cf"} Apr 16 18:12:16.717484 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:16.717429 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-74lpz" podStartSLOduration=3.01302129 podStartE2EDuration="3.717415363s" podCreationTimestamp="2026-04-16 18:12:13 +0000 UTC" firstStartedPulling="2026-04-16 18:12:14.338297375 +0000 UTC m=+121.701418663" lastFinishedPulling="2026-04-16 18:12:15.042691455 +0000 UTC m=+122.405812736" observedRunningTime="2026-04-16 18:12:16.715949585 +0000 UTC m=+124.079071176" watchObservedRunningTime="2026-04-16 18:12:16.717415363 +0000 UTC m=+124.080536661" Apr 16 18:12:23.194320 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:23.194262 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95b4efe3-7d49-40fc-a8e3-4381e92ed949-metrics-certs\") pod \"network-metrics-daemon-7gnqt\" (UID: \"95b4efe3-7d49-40fc-a8e3-4381e92ed949\") " pod="openshift-multus/network-metrics-daemon-7gnqt" Apr 16 18:12:23.196512 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:23.196493 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95b4efe3-7d49-40fc-a8e3-4381e92ed949-metrics-certs\") pod \"network-metrics-daemon-7gnqt\" (UID: \"95b4efe3-7d49-40fc-a8e3-4381e92ed949\") " pod="openshift-multus/network-metrics-daemon-7gnqt" Apr 16 18:12:23.449986 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:23.449905 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xdm8g\"" Apr 16 18:12:23.458365 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:23.458346 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7gnqt" Apr 16 18:12:23.576182 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:23.576155 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7gnqt"] Apr 16 18:12:23.579374 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:12:23.579334 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95b4efe3_7d49_40fc_a8e3_4381e92ed949.slice/crio-58191d930c9562330ddf877178863a97a622e4602c287465ef93deb3c28db053 WatchSource:0}: Error finding container 58191d930c9562330ddf877178863a97a622e4602c287465ef93deb3c28db053: Status 404 returned error can't find the container with id 58191d930c9562330ddf877178863a97a622e4602c287465ef93deb3c28db053 Apr 16 18:12:23.717100 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:23.717019 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7gnqt" event={"ID":"95b4efe3-7d49-40fc-a8e3-4381e92ed949","Type":"ContainerStarted","Data":"58191d930c9562330ddf877178863a97a622e4602c287465ef93deb3c28db053"} Apr 16 18:12:24.723783 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:24.723744 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7gnqt" 
event={"ID":"95b4efe3-7d49-40fc-a8e3-4381e92ed949","Type":"ContainerStarted","Data":"ee1d5e76defbab6926bd629599b1549c2ef56d536924b2259e1c12827e5382c4"} Apr 16 18:12:25.728004 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:25.727967 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7gnqt" event={"ID":"95b4efe3-7d49-40fc-a8e3-4381e92ed949","Type":"ContainerStarted","Data":"db0ed12de3d8a4b23da7d811db7bd29eb8b378efa4c0ef8c9fc8542b8c867758"} Apr 16 18:12:25.744175 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:25.744122 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7gnqt" podStartSLOduration=131.768368916 podStartE2EDuration="2m12.74410726s" podCreationTimestamp="2026-04-16 18:10:13 +0000 UTC" firstStartedPulling="2026-04-16 18:12:23.581260847 +0000 UTC m=+130.944382124" lastFinishedPulling="2026-04-16 18:12:24.556999188 +0000 UTC m=+131.920120468" observedRunningTime="2026-04-16 18:12:25.742766305 +0000 UTC m=+133.105887604" watchObservedRunningTime="2026-04-16 18:12:25.74410726 +0000 UTC m=+133.107228557" Apr 16 18:12:38.816983 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:38.816922 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" podUID="02a0f007-c6a0-4878-9fb4-f22bb0637984" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 18:12:39.691340 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:39.691305 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-69ff8ff76b-sxx75"] Apr 16 18:12:39.691524 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:12:39.691505 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" 
pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" podUID="2f002970-ca6d-4ffd-b054-caec8b6e0479" Apr 16 18:12:39.764023 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:39.763992 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:12:39.768193 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:39.768168 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:12:39.830192 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:39.830165 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f002970-ca6d-4ffd-b054-caec8b6e0479-trusted-ca\") pod \"2f002970-ca6d-4ffd-b054-caec8b6e0479\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " Apr 16 18:12:39.830637 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:39.830200 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2f002970-ca6d-4ffd-b054-caec8b6e0479-image-registry-private-configuration\") pod \"2f002970-ca6d-4ffd-b054-caec8b6e0479\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " Apr 16 18:12:39.830637 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:39.830223 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-bound-sa-token\") pod \"2f002970-ca6d-4ffd-b054-caec8b6e0479\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " Apr 16 18:12:39.830637 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:39.830250 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-certificates\") pod 
\"2f002970-ca6d-4ffd-b054-caec8b6e0479\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " Apr 16 18:12:39.830637 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:39.830345 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdv6t\" (UniqueName: \"kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-kube-api-access-bdv6t\") pod \"2f002970-ca6d-4ffd-b054-caec8b6e0479\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " Apr 16 18:12:39.830637 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:39.830399 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2f002970-ca6d-4ffd-b054-caec8b6e0479-ca-trust-extracted\") pod \"2f002970-ca6d-4ffd-b054-caec8b6e0479\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " Apr 16 18:12:39.830637 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:39.830456 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2f002970-ca6d-4ffd-b054-caec8b6e0479-installation-pull-secrets\") pod \"2f002970-ca6d-4ffd-b054-caec8b6e0479\" (UID: \"2f002970-ca6d-4ffd-b054-caec8b6e0479\") " Apr 16 18:12:39.830637 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:39.830578 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "2f002970-ca6d-4ffd-b054-caec8b6e0479" (UID: "2f002970-ca6d-4ffd-b054-caec8b6e0479"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:12:39.830943 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:39.830722 2574 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-certificates\") on node \"ip-10-0-143-48.ec2.internal\" DevicePath \"\"" Apr 16 18:12:39.830943 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:39.830749 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f002970-ca6d-4ffd-b054-caec8b6e0479-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "2f002970-ca6d-4ffd-b054-caec8b6e0479" (UID: "2f002970-ca6d-4ffd-b054-caec8b6e0479"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:12:39.830943 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:39.830789 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f002970-ca6d-4ffd-b054-caec8b6e0479-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2f002970-ca6d-4ffd-b054-caec8b6e0479" (UID: "2f002970-ca6d-4ffd-b054-caec8b6e0479"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:12:39.832733 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:39.832684 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f002970-ca6d-4ffd-b054-caec8b6e0479-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "2f002970-ca6d-4ffd-b054-caec8b6e0479" (UID: "2f002970-ca6d-4ffd-b054-caec8b6e0479"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:12:39.832733 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:39.832719 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-kube-api-access-bdv6t" (OuterVolumeSpecName: "kube-api-access-bdv6t") pod "2f002970-ca6d-4ffd-b054-caec8b6e0479" (UID: "2f002970-ca6d-4ffd-b054-caec8b6e0479"). InnerVolumeSpecName "kube-api-access-bdv6t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:12:39.832900 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:39.832876 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "2f002970-ca6d-4ffd-b054-caec8b6e0479" (UID: "2f002970-ca6d-4ffd-b054-caec8b6e0479"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:12:39.832938 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:39.832924 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f002970-ca6d-4ffd-b054-caec8b6e0479-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "2f002970-ca6d-4ffd-b054-caec8b6e0479" (UID: "2f002970-ca6d-4ffd-b054-caec8b6e0479"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:12:39.931421 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:39.931381 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bdv6t\" (UniqueName: \"kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-kube-api-access-bdv6t\") on node \"ip-10-0-143-48.ec2.internal\" DevicePath \"\"" Apr 16 18:12:39.931421 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:39.931409 2574 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2f002970-ca6d-4ffd-b054-caec8b6e0479-ca-trust-extracted\") on node \"ip-10-0-143-48.ec2.internal\" DevicePath \"\"" Apr 16 18:12:39.931421 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:39.931419 2574 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2f002970-ca6d-4ffd-b054-caec8b6e0479-installation-pull-secrets\") on node \"ip-10-0-143-48.ec2.internal\" DevicePath \"\"" Apr 16 18:12:39.931421 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:39.931428 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f002970-ca6d-4ffd-b054-caec8b6e0479-trusted-ca\") on node \"ip-10-0-143-48.ec2.internal\" DevicePath \"\"" Apr 16 18:12:39.931661 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:39.931438 2574 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2f002970-ca6d-4ffd-b054-caec8b6e0479-image-registry-private-configuration\") on node \"ip-10-0-143-48.ec2.internal\" DevicePath \"\"" Apr 16 18:12:39.931661 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:39.931449 2574 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-bound-sa-token\") on node \"ip-10-0-143-48.ec2.internal\" DevicePath \"\"" Apr 16 
18:12:40.766158 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:40.766119 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69ff8ff76b-sxx75" Apr 16 18:12:40.806604 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:40.806566 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-69ff8ff76b-sxx75"] Apr 16 18:12:40.811485 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:40.811459 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-69ff8ff76b-sxx75"] Apr 16 18:12:40.939447 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:40.939425 2574 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2f002970-ca6d-4ffd-b054-caec8b6e0479-registry-tls\") on node \"ip-10-0-143-48.ec2.internal\" DevicePath \"\"" Apr 16 18:12:41.337080 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:41.337049 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f002970-ca6d-4ffd-b054-caec8b6e0479" path="/var/lib/kubelet/pods/2f002970-ca6d-4ffd-b054-caec8b6e0479/volumes" Apr 16 18:12:48.816821 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:48.816774 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" podUID="02a0f007-c6a0-4878-9fb4-f22bb0637984" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 18:12:49.999370 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:12:49.999324 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-d6hfh" podUID="0d86e7f9-1a3a-4778-a936-753a6e1ee886" Apr 16 18:12:50.013594 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:12:50.013571 2574 
pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-thnb8" podUID="92d4b593-ee95-460f-9517-0583cadeaeb1" Apr 16 18:12:50.790085 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:50.790055 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-thnb8" Apr 16 18:12:50.790251 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:50.790207 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-d6hfh" Apr 16 18:12:54.944976 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:54.944945 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92d4b593-ee95-460f-9517-0583cadeaeb1-cert\") pod \"ingress-canary-thnb8\" (UID: \"92d4b593-ee95-460f-9517-0583cadeaeb1\") " pod="openshift-ingress-canary/ingress-canary-thnb8" Apr 16 18:12:54.945357 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:54.944995 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0d86e7f9-1a3a-4778-a936-753a6e1ee886-metrics-tls\") pod \"dns-default-d6hfh\" (UID: \"0d86e7f9-1a3a-4778-a936-753a6e1ee886\") " pod="openshift-dns/dns-default-d6hfh" Apr 16 18:12:54.947311 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:54.947281 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0d86e7f9-1a3a-4778-a936-753a6e1ee886-metrics-tls\") pod \"dns-default-d6hfh\" (UID: \"0d86e7f9-1a3a-4778-a936-753a6e1ee886\") " pod="openshift-dns/dns-default-d6hfh" Apr 16 18:12:54.947564 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:54.947541 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/92d4b593-ee95-460f-9517-0583cadeaeb1-cert\") pod \"ingress-canary-thnb8\" (UID: \"92d4b593-ee95-460f-9517-0583cadeaeb1\") " pod="openshift-ingress-canary/ingress-canary-thnb8" Apr 16 18:12:54.993238 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:54.993211 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-n88vn\"" Apr 16 18:12:54.993770 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:54.993735 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-kd8rq\"" Apr 16 18:12:55.001247 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:55.001231 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-thnb8" Apr 16 18:12:55.001333 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:55.001321 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-d6hfh" Apr 16 18:12:55.128764 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:55.128733 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-d6hfh"] Apr 16 18:12:55.131787 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:12:55.131758 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d86e7f9_1a3a_4778_a936_753a6e1ee886.slice/crio-437d981e091f277340865b041a9a1b4e2f165aec10cabd7756c76af95e3192ac WatchSource:0}: Error finding container 437d981e091f277340865b041a9a1b4e2f165aec10cabd7756c76af95e3192ac: Status 404 returned error can't find the container with id 437d981e091f277340865b041a9a1b4e2f165aec10cabd7756c76af95e3192ac Apr 16 18:12:55.145167 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:55.145146 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-thnb8"] Apr 16 18:12:55.147948 ip-10-0-143-48 
kubenswrapper[2574]: W0416 18:12:55.147916 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92d4b593_ee95_460f_9517_0583cadeaeb1.slice/crio-5ea6f063639011e96c5bb11648df83e202a886d09106bb56cbe932a40cf40281 WatchSource:0}: Error finding container 5ea6f063639011e96c5bb11648df83e202a886d09106bb56cbe932a40cf40281: Status 404 returned error can't find the container with id 5ea6f063639011e96c5bb11648df83e202a886d09106bb56cbe932a40cf40281 Apr 16 18:12:55.803475 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:55.803436 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-thnb8" event={"ID":"92d4b593-ee95-460f-9517-0583cadeaeb1","Type":"ContainerStarted","Data":"5ea6f063639011e96c5bb11648df83e202a886d09106bb56cbe932a40cf40281"} Apr 16 18:12:55.804660 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:55.804622 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-d6hfh" event={"ID":"0d86e7f9-1a3a-4778-a936-753a6e1ee886","Type":"ContainerStarted","Data":"437d981e091f277340865b041a9a1b4e2f165aec10cabd7756c76af95e3192ac"} Apr 16 18:12:57.811286 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:57.811237 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-thnb8" event={"ID":"92d4b593-ee95-460f-9517-0583cadeaeb1","Type":"ContainerStarted","Data":"82fe3cf79a8b7375acf68215db562c03bd07d59a0affc7e4e92722d32d08f5da"} Apr 16 18:12:57.812658 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:57.812637 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-d6hfh" event={"ID":"0d86e7f9-1a3a-4778-a936-753a6e1ee886","Type":"ContainerStarted","Data":"ae7c972a145ab5cee204804210b4251f7ab7525f59dfa976542a79e359044173"} Apr 16 18:12:57.812765 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:57.812665 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-d6hfh" event={"ID":"0d86e7f9-1a3a-4778-a936-753a6e1ee886","Type":"ContainerStarted","Data":"cb3ea3b05af04c7dcbde11bdf9c129ff49607f4f7a5eeb7b6cd5be71b2381306"} Apr 16 18:12:57.812808 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:57.812764 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-d6hfh" Apr 16 18:12:57.824567 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:57.824526 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-thnb8" podStartSLOduration=129.959541694 podStartE2EDuration="2m11.824513924s" podCreationTimestamp="2026-04-16 18:10:46 +0000 UTC" firstStartedPulling="2026-04-16 18:12:55.150022388 +0000 UTC m=+162.513143677" lastFinishedPulling="2026-04-16 18:12:57.014994627 +0000 UTC m=+164.378115907" observedRunningTime="2026-04-16 18:12:57.823717554 +0000 UTC m=+165.186838846" watchObservedRunningTime="2026-04-16 18:12:57.824513924 +0000 UTC m=+165.187635222" Apr 16 18:12:57.838352 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:57.838302 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-d6hfh" podStartSLOduration=129.955675529 podStartE2EDuration="2m11.838284997s" podCreationTimestamp="2026-04-16 18:10:46 +0000 UTC" firstStartedPulling="2026-04-16 18:12:55.133724242 +0000 UTC m=+162.496845519" lastFinishedPulling="2026-04-16 18:12:57.016333701 +0000 UTC m=+164.379454987" observedRunningTime="2026-04-16 18:12:57.837863633 +0000 UTC m=+165.200984948" watchObservedRunningTime="2026-04-16 18:12:57.838284997 +0000 UTC m=+165.201406296" Apr 16 18:12:58.817005 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:58.816971 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" podUID="02a0f007-c6a0-4878-9fb4-f22bb0637984" containerName="service-proxy" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Apr 16 18:12:58.817351 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:58.817040 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" Apr 16 18:12:58.817532 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:58.817499 2574 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"0ff8defbcd2528404f946be7716877e74aa318bd4632dd79f4a5fa2da2fe2d80"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 16 18:12:58.817602 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:58.817584 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" podUID="02a0f007-c6a0-4878-9fb4-f22bb0637984" containerName="service-proxy" containerID="cri-o://0ff8defbcd2528404f946be7716877e74aa318bd4632dd79f4a5fa2da2fe2d80" gracePeriod=30 Apr 16 18:12:59.819976 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:59.819943 2574 generic.go:358] "Generic (PLEG): container finished" podID="02a0f007-c6a0-4878-9fb4-f22bb0637984" containerID="0ff8defbcd2528404f946be7716877e74aa318bd4632dd79f4a5fa2da2fe2d80" exitCode=2 Apr 16 18:12:59.820331 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:59.820010 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" event={"ID":"02a0f007-c6a0-4878-9fb4-f22bb0637984","Type":"ContainerDied","Data":"0ff8defbcd2528404f946be7716877e74aa318bd4632dd79f4a5fa2da2fe2d80"} Apr 16 18:12:59.820331 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:12:59.820043 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d757d5885-gm9sk" event={"ID":"02a0f007-c6a0-4878-9fb4-f22bb0637984","Type":"ContainerStarted","Data":"6ae9575778abc667fa4fb5f02088247085a6aa0ed2ba8b84b283ce098b1a08a0"} Apr 16 18:13:07.817375 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:13:07.817299 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-d6hfh" Apr 16 18:15:13.194237 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:15:13.194207 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log" Apr 16 18:15:13.194759 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:15:13.194207 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log" Apr 16 18:15:13.200505 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:15:13.200483 2574 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 18:16:27.696012 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:27.695928 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-qmwqw"] Apr 16 18:16:27.698914 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:27.698897 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qmwqw" Apr 16 18:16:27.700997 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:27.700978 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 18:16:27.701107 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:27.701097 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 18:16:27.701596 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:27.701579 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-4998z\"" Apr 16 18:16:27.701596 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:27.701593 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 16 18:16:27.701747 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:27.701596 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 18:16:27.701747 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:27.701668 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 18:16:27.706241 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:27.706220 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-qmwqw"] Apr 16 18:16:27.817229 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:27.817200 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/32f7c604-fd53-422a-a61f-1a048cb43531-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-qmwqw\" (UID: \"32f7c604-fd53-422a-a61f-1a048cb43531\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qmwqw" Apr 16 18:16:27.817381 
ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:27.817236 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/32f7c604-fd53-422a-a61f-1a048cb43531-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qmwqw\" (UID: \"32f7c604-fd53-422a-a61f-1a048cb43531\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qmwqw" Apr 16 18:16:27.817381 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:27.817279 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf5b9\" (UniqueName: \"kubernetes.io/projected/32f7c604-fd53-422a-a61f-1a048cb43531-kube-api-access-tf5b9\") pod \"keda-metrics-apiserver-7c9f485588-qmwqw\" (UID: \"32f7c604-fd53-422a-a61f-1a048cb43531\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qmwqw" Apr 16 18:16:27.855381 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:27.855353 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-8mxm4"] Apr 16 18:16:27.858299 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:27.858284 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-8mxm4" Apr 16 18:16:27.861004 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:27.860986 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 18:16:27.866556 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:27.866528 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-8mxm4"] Apr 16 18:16:27.917840 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:27.917811 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b16d9ae1-1e9a-45c5-83fd-e466b65c9d90-certificates\") pod \"keda-admission-cf49989db-8mxm4\" (UID: \"b16d9ae1-1e9a-45c5-83fd-e466b65c9d90\") " pod="openshift-keda/keda-admission-cf49989db-8mxm4" Apr 16 18:16:27.917966 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:27.917848 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/32f7c604-fd53-422a-a61f-1a048cb43531-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-qmwqw\" (UID: \"32f7c604-fd53-422a-a61f-1a048cb43531\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qmwqw" Apr 16 18:16:27.917966 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:27.917867 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/32f7c604-fd53-422a-a61f-1a048cb43531-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qmwqw\" (UID: \"32f7c604-fd53-422a-a61f-1a048cb43531\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qmwqw" Apr 16 18:16:27.917966 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:27.917923 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tf5b9\" (UniqueName: 
\"kubernetes.io/projected/32f7c604-fd53-422a-a61f-1a048cb43531-kube-api-access-tf5b9\") pod \"keda-metrics-apiserver-7c9f485588-qmwqw\" (UID: \"32f7c604-fd53-422a-a61f-1a048cb43531\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qmwqw" Apr 16 18:16:27.917966 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:27.917952 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nfg6\" (UniqueName: \"kubernetes.io/projected/b16d9ae1-1e9a-45c5-83fd-e466b65c9d90-kube-api-access-9nfg6\") pod \"keda-admission-cf49989db-8mxm4\" (UID: \"b16d9ae1-1e9a-45c5-83fd-e466b65c9d90\") " pod="openshift-keda/keda-admission-cf49989db-8mxm4" Apr 16 18:16:27.918101 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:16:27.917979 2574 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:16:27.918101 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:16:27.918003 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:16:27.918101 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:16:27.918022 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-qmwqw: references non-existent secret key: tls.crt Apr 16 18:16:27.918101 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:16:27.918085 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/32f7c604-fd53-422a-a61f-1a048cb43531-certificates podName:32f7c604-fd53-422a-a61f-1a048cb43531 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:28.418071302 +0000 UTC m=+375.781192582 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/32f7c604-fd53-422a-a61f-1a048cb43531-certificates") pod "keda-metrics-apiserver-7c9f485588-qmwqw" (UID: "32f7c604-fd53-422a-a61f-1a048cb43531") : references non-existent secret key: tls.crt Apr 16 18:16:27.918225 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:27.918212 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/32f7c604-fd53-422a-a61f-1a048cb43531-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-qmwqw\" (UID: \"32f7c604-fd53-422a-a61f-1a048cb43531\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qmwqw" Apr 16 18:16:27.926462 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:27.926439 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf5b9\" (UniqueName: \"kubernetes.io/projected/32f7c604-fd53-422a-a61f-1a048cb43531-kube-api-access-tf5b9\") pod \"keda-metrics-apiserver-7c9f485588-qmwqw\" (UID: \"32f7c604-fd53-422a-a61f-1a048cb43531\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qmwqw" Apr 16 18:16:28.018801 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:28.018727 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b16d9ae1-1e9a-45c5-83fd-e466b65c9d90-certificates\") pod \"keda-admission-cf49989db-8mxm4\" (UID: \"b16d9ae1-1e9a-45c5-83fd-e466b65c9d90\") " pod="openshift-keda/keda-admission-cf49989db-8mxm4" Apr 16 18:16:28.018801 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:28.018796 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9nfg6\" (UniqueName: \"kubernetes.io/projected/b16d9ae1-1e9a-45c5-83fd-e466b65c9d90-kube-api-access-9nfg6\") pod \"keda-admission-cf49989db-8mxm4\" (UID: \"b16d9ae1-1e9a-45c5-83fd-e466b65c9d90\") " pod="openshift-keda/keda-admission-cf49989db-8mxm4" Apr 16 
18:16:28.018980 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:16:28.018859 2574 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 16 18:16:28.018980 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:16:28.018883 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-8mxm4: secret "keda-admission-webhooks-certs" not found Apr 16 18:16:28.018980 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:16:28.018940 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b16d9ae1-1e9a-45c5-83fd-e466b65c9d90-certificates podName:b16d9ae1-1e9a-45c5-83fd-e466b65c9d90 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:28.518926397 +0000 UTC m=+375.882047673 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b16d9ae1-1e9a-45c5-83fd-e466b65c9d90-certificates") pod "keda-admission-cf49989db-8mxm4" (UID: "b16d9ae1-1e9a-45c5-83fd-e466b65c9d90") : secret "keda-admission-webhooks-certs" not found Apr 16 18:16:28.026794 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:28.026769 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nfg6\" (UniqueName: \"kubernetes.io/projected/b16d9ae1-1e9a-45c5-83fd-e466b65c9d90-kube-api-access-9nfg6\") pod \"keda-admission-cf49989db-8mxm4\" (UID: \"b16d9ae1-1e9a-45c5-83fd-e466b65c9d90\") " pod="openshift-keda/keda-admission-cf49989db-8mxm4" Apr 16 18:16:28.422497 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:28.422460 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/32f7c604-fd53-422a-a61f-1a048cb43531-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qmwqw\" (UID: \"32f7c604-fd53-422a-a61f-1a048cb43531\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qmwqw" Apr 16 18:16:28.422659 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:16:28.422600 2574 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:16:28.422659 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:16:28.422614 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:16:28.422659 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:16:28.422631 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-qmwqw: references non-existent secret key: tls.crt Apr 16 18:16:28.422787 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:16:28.422711 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/32f7c604-fd53-422a-a61f-1a048cb43531-certificates podName:32f7c604-fd53-422a-a61f-1a048cb43531 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:29.42267522 +0000 UTC m=+376.785796498 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/32f7c604-fd53-422a-a61f-1a048cb43531-certificates") pod "keda-metrics-apiserver-7c9f485588-qmwqw" (UID: "32f7c604-fd53-422a-a61f-1a048cb43531") : references non-existent secret key: tls.crt Apr 16 18:16:28.522941 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:28.522910 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b16d9ae1-1e9a-45c5-83fd-e466b65c9d90-certificates\") pod \"keda-admission-cf49989db-8mxm4\" (UID: \"b16d9ae1-1e9a-45c5-83fd-e466b65c9d90\") " pod="openshift-keda/keda-admission-cf49989db-8mxm4" Apr 16 18:16:28.525231 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:28.525207 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b16d9ae1-1e9a-45c5-83fd-e466b65c9d90-certificates\") pod \"keda-admission-cf49989db-8mxm4\" (UID: \"b16d9ae1-1e9a-45c5-83fd-e466b65c9d90\") " pod="openshift-keda/keda-admission-cf49989db-8mxm4" Apr 16 18:16:28.769024 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:28.768930 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-8mxm4" Apr 16 18:16:28.884945 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:28.884907 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-8mxm4"] Apr 16 18:16:28.888014 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:16:28.887990 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb16d9ae1_1e9a_45c5_83fd_e466b65c9d90.slice/crio-fb6f7a23809ae835019bbc9d791c6e6007386934aad6b7a952e60298229c12f9 WatchSource:0}: Error finding container fb6f7a23809ae835019bbc9d791c6e6007386934aad6b7a952e60298229c12f9: Status 404 returned error can't find the container with id fb6f7a23809ae835019bbc9d791c6e6007386934aad6b7a952e60298229c12f9 Apr 16 18:16:28.889144 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:28.889128 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:16:29.348554 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:29.348520 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-8mxm4" event={"ID":"b16d9ae1-1e9a-45c5-83fd-e466b65c9d90","Type":"ContainerStarted","Data":"fb6f7a23809ae835019bbc9d791c6e6007386934aad6b7a952e60298229c12f9"} Apr 16 18:16:29.430084 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:29.430033 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/32f7c604-fd53-422a-a61f-1a048cb43531-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qmwqw\" (UID: \"32f7c604-fd53-422a-a61f-1a048cb43531\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qmwqw" Apr 16 18:16:29.430249 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:16:29.430210 2574 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:16:29.430249 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:16:29.430233 
2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:16:29.430369 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:16:29.430257 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-qmwqw: references non-existent secret key: tls.crt Apr 16 18:16:29.430369 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:16:29.430324 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/32f7c604-fd53-422a-a61f-1a048cb43531-certificates podName:32f7c604-fd53-422a-a61f-1a048cb43531 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:31.430304632 +0000 UTC m=+378.793425923 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/32f7c604-fd53-422a-a61f-1a048cb43531-certificates") pod "keda-metrics-apiserver-7c9f485588-qmwqw" (UID: "32f7c604-fd53-422a-a61f-1a048cb43531") : references non-existent secret key: tls.crt Apr 16 18:16:31.354339 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:31.354306 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-8mxm4" event={"ID":"b16d9ae1-1e9a-45c5-83fd-e466b65c9d90","Type":"ContainerStarted","Data":"2ae9d32512c7b2c3b3796b938fa51563f8a2b7b161f34e6f733d26e37263ee43"} Apr 16 18:16:31.354725 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:31.354427 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-8mxm4" Apr 16 18:16:31.369825 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:31.369780 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-8mxm4" podStartSLOduration=2.240026637 podStartE2EDuration="4.369767891s" podCreationTimestamp="2026-04-16 18:16:27 +0000 UTC" firstStartedPulling="2026-04-16 18:16:28.889249404 
+0000 UTC m=+376.252370680" lastFinishedPulling="2026-04-16 18:16:31.018990647 +0000 UTC m=+378.382111934" observedRunningTime="2026-04-16 18:16:31.368233811 +0000 UTC m=+378.731355108" watchObservedRunningTime="2026-04-16 18:16:31.369767891 +0000 UTC m=+378.732889189" Apr 16 18:16:31.446897 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:31.446861 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/32f7c604-fd53-422a-a61f-1a048cb43531-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qmwqw\" (UID: \"32f7c604-fd53-422a-a61f-1a048cb43531\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qmwqw" Apr 16 18:16:31.449374 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:31.449351 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/32f7c604-fd53-422a-a61f-1a048cb43531-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qmwqw\" (UID: \"32f7c604-fd53-422a-a61f-1a048cb43531\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qmwqw" Apr 16 18:16:31.609151 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:31.609078 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qmwqw" Apr 16 18:16:31.720575 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:31.720541 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-qmwqw"] Apr 16 18:16:31.723562 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:16:31.723538 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32f7c604_fd53_422a_a61f_1a048cb43531.slice/crio-d2de210058dba3066d8f853c60db501dff702ad893725f71bdc34390a7f098be WatchSource:0}: Error finding container d2de210058dba3066d8f853c60db501dff702ad893725f71bdc34390a7f098be: Status 404 returned error can't find the container with id d2de210058dba3066d8f853c60db501dff702ad893725f71bdc34390a7f098be Apr 16 18:16:32.357553 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:32.357507 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qmwqw" event={"ID":"32f7c604-fd53-422a-a61f-1a048cb43531","Type":"ContainerStarted","Data":"d2de210058dba3066d8f853c60db501dff702ad893725f71bdc34390a7f098be"} Apr 16 18:16:35.365500 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:35.365468 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qmwqw" event={"ID":"32f7c604-fd53-422a-a61f-1a048cb43531","Type":"ContainerStarted","Data":"9eb3d27c408173fd4d515dda49eb6535d20c42467f06a120c1f1761b4de51aed"} Apr 16 18:16:35.365900 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:35.365670 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qmwqw" Apr 16 18:16:35.379762 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:35.379719 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qmwqw" 
podStartSLOduration=5.794830469 podStartE2EDuration="8.379690504s" podCreationTimestamp="2026-04-16 18:16:27 +0000 UTC" firstStartedPulling="2026-04-16 18:16:31.725287905 +0000 UTC m=+379.088409181" lastFinishedPulling="2026-04-16 18:16:34.310147936 +0000 UTC m=+381.673269216" observedRunningTime="2026-04-16 18:16:35.379034844 +0000 UTC m=+382.742156163" watchObservedRunningTime="2026-04-16 18:16:35.379690504 +0000 UTC m=+382.742811801" Apr 16 18:16:46.372904 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:46.372877 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qmwqw" Apr 16 18:16:52.360330 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:16:52.360294 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-8mxm4" Apr 16 18:17:33.704953 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:17:33.704869 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-w87nh"] Apr 16 18:17:33.707864 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:17:33.707838 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-w87nh"
Apr 16 18:17:33.710885 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:17:33.710861 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-b578k\""
Apr 16 18:17:33.711000 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:17:33.710900 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 16 18:17:33.711000 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:17:33.710976 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 18:17:33.711358 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:17:33.711343 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 18:17:33.716754 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:17:33.716735 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-w87nh"]
Apr 16 18:17:33.771481 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:17:33.771443 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n7c5\" (UniqueName: \"kubernetes.io/projected/904f3b69-7352-4923-9274-ef46c599c24a-kube-api-access-8n7c5\") pod \"llmisvc-controller-manager-68cc5db7c4-w87nh\" (UID: \"904f3b69-7352-4923-9274-ef46c599c24a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-w87nh"
Apr 16 18:17:33.771644 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:17:33.771510 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/904f3b69-7352-4923-9274-ef46c599c24a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-w87nh\" (UID: \"904f3b69-7352-4923-9274-ef46c599c24a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-w87nh"
Apr 16 18:17:33.872016 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:17:33.871980 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8n7c5\" (UniqueName: \"kubernetes.io/projected/904f3b69-7352-4923-9274-ef46c599c24a-kube-api-access-8n7c5\") pod \"llmisvc-controller-manager-68cc5db7c4-w87nh\" (UID: \"904f3b69-7352-4923-9274-ef46c599c24a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-w87nh"
Apr 16 18:17:33.872189 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:17:33.872027 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/904f3b69-7352-4923-9274-ef46c599c24a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-w87nh\" (UID: \"904f3b69-7352-4923-9274-ef46c599c24a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-w87nh"
Apr 16 18:17:33.872189 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:17:33.872119 2574 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found
Apr 16 18:17:33.872189 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:17:33.872183 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/904f3b69-7352-4923-9274-ef46c599c24a-cert podName:904f3b69-7352-4923-9274-ef46c599c24a nodeName:}" failed. No retries permitted until 2026-04-16 18:17:34.372167217 +0000 UTC m=+441.735288493 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/904f3b69-7352-4923-9274-ef46c599c24a-cert") pod "llmisvc-controller-manager-68cc5db7c4-w87nh" (UID: "904f3b69-7352-4923-9274-ef46c599c24a") : secret "llmisvc-webhook-server-cert" not found
Apr 16 18:17:33.879911 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:17:33.879889 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n7c5\" (UniqueName: \"kubernetes.io/projected/904f3b69-7352-4923-9274-ef46c599c24a-kube-api-access-8n7c5\") pod \"llmisvc-controller-manager-68cc5db7c4-w87nh\" (UID: \"904f3b69-7352-4923-9274-ef46c599c24a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-w87nh"
Apr 16 18:17:34.376288 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:17:34.376249 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/904f3b69-7352-4923-9274-ef46c599c24a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-w87nh\" (UID: \"904f3b69-7352-4923-9274-ef46c599c24a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-w87nh"
Apr 16 18:17:34.378573 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:17:34.378555 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/904f3b69-7352-4923-9274-ef46c599c24a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-w87nh\" (UID: \"904f3b69-7352-4923-9274-ef46c599c24a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-w87nh"
Apr 16 18:17:34.618195 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:17:34.618151 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-w87nh"
Apr 16 18:17:34.735266 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:17:34.735238 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-w87nh"]
Apr 16 18:17:34.738421 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:17:34.738392 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod904f3b69_7352_4923_9274_ef46c599c24a.slice/crio-0419c24f4cddbc4c88cc8beba4a9e88fc304cbc6d9029060c69704660a2ebd7f WatchSource:0}: Error finding container 0419c24f4cddbc4c88cc8beba4a9e88fc304cbc6d9029060c69704660a2ebd7f: Status 404 returned error can't find the container with id 0419c24f4cddbc4c88cc8beba4a9e88fc304cbc6d9029060c69704660a2ebd7f
Apr 16 18:17:35.522030 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:17:35.521987 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-w87nh" event={"ID":"904f3b69-7352-4923-9274-ef46c599c24a","Type":"ContainerStarted","Data":"0419c24f4cddbc4c88cc8beba4a9e88fc304cbc6d9029060c69704660a2ebd7f"}
Apr 16 18:17:37.528225 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:17:37.528191 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-w87nh" event={"ID":"904f3b69-7352-4923-9274-ef46c599c24a","Type":"ContainerStarted","Data":"c2661068a51f3265d944993986f5379fed7800700d63bced0716790a2175032c"}
Apr 16 18:17:37.528608 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:17:37.528298 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-w87nh"
Apr 16 18:17:37.542196 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:17:37.542162 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-w87nh" podStartSLOduration=2.739543298 podStartE2EDuration="4.542150096s" podCreationTimestamp="2026-04-16 18:17:33 +0000 UTC" firstStartedPulling="2026-04-16 18:17:34.739572041 +0000 UTC m=+442.102693317" lastFinishedPulling="2026-04-16 18:17:36.54217884 +0000 UTC m=+443.905300115" observedRunningTime="2026-04-16 18:17:37.541513785 +0000 UTC m=+444.904635102" watchObservedRunningTime="2026-04-16 18:17:37.542150096 +0000 UTC m=+444.905271396"
Apr 16 18:18:08.533723 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:18:08.533677 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-w87nh"
Apr 16 18:20:13.212970 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:20:13.212941 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log"
Apr 16 18:20:13.214419 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:20:13.214398 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log"
Apr 16 18:22:47.558196 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:22:47.558163 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-s4pqq"]
Apr 16 18:22:47.561233 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:22:47.561211 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-s4pqq"
Apr 16 18:22:47.563029 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:22:47.563008 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ww52g\""
Apr 16 18:22:47.566201 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:22:47.566181 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-s4pqq"]
Apr 16 18:22:47.572120 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:22:47.572105 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-s4pqq"
Apr 16 18:22:47.685569 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:22:47.685547 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-s4pqq"]
Apr 16 18:22:47.687953 ip-10-0-143-48 kubenswrapper[2574]: W0416 18:22:47.687924 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1657f577_bc69_4915_99a9_d08d76d99874.slice/crio-15410557ce28dd49867b46842e34ef3ea722bada20bc8b9fe4d3e679af0b63b8 WatchSource:0}: Error finding container 15410557ce28dd49867b46842e34ef3ea722bada20bc8b9fe4d3e679af0b63b8: Status 404 returned error can't find the container with id 15410557ce28dd49867b46842e34ef3ea722bada20bc8b9fe4d3e679af0b63b8
Apr 16 18:22:47.689757 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:22:47.689735 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:22:48.320314 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:22:48.320278 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-s4pqq" event={"ID":"1657f577-bc69-4915-99a9-d08d76d99874","Type":"ContainerStarted","Data":"15410557ce28dd49867b46842e34ef3ea722bada20bc8b9fe4d3e679af0b63b8"}
Apr 16 18:22:49.324018 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:22:49.323984 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-s4pqq" event={"ID":"1657f577-bc69-4915-99a9-d08d76d99874","Type":"ContainerStarted","Data":"968c7685878cdc894f8ceed4fde5b5e8a8c99d932d969549c7c007bd7cc4f0a1"}
Apr 16 18:22:49.324416 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:22:49.324251 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-s4pqq"
Apr 16 18:22:49.325864 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:22:49.325839 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-s4pqq"
Apr 16 18:22:49.337748 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:22:49.337643 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-s4pqq" podStartSLOduration=0.946934278 podStartE2EDuration="2.337629479s" podCreationTimestamp="2026-04-16 18:22:47 +0000 UTC" firstStartedPulling="2026-04-16 18:22:47.689875705 +0000 UTC m=+755.052996980" lastFinishedPulling="2026-04-16 18:22:49.080570901 +0000 UTC m=+756.443692181" observedRunningTime="2026-04-16 18:22:49.336685697 +0000 UTC m=+756.699806989" watchObservedRunningTime="2026-04-16 18:22:49.337629479 +0000 UTC m=+756.700750776"
Apr 16 18:24:22.651386 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:24:22.651348 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-7f66cccfb6-s4pqq_1657f577-bc69-4915-99a9-d08d76d99874/kserve-container/0.log"
Apr 16 18:24:22.931200 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:24:22.931120 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-s4pqq"]
Apr 16 18:24:22.931393 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:24:22.931364 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-s4pqq" podUID="1657f577-bc69-4915-99a9-d08d76d99874" containerName="kserve-container" containerID="cri-o://968c7685878cdc894f8ceed4fde5b5e8a8c99d932d969549c7c007bd7cc4f0a1" gracePeriod=30
Apr 16 18:24:23.162650 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:24:23.162631 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-s4pqq"
Apr 16 18:24:23.559126 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:24:23.559091 2574 generic.go:358] "Generic (PLEG): container finished" podID="1657f577-bc69-4915-99a9-d08d76d99874" containerID="968c7685878cdc894f8ceed4fde5b5e8a8c99d932d969549c7c007bd7cc4f0a1" exitCode=2
Apr 16 18:24:23.559291 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:24:23.559154 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-s4pqq"
Apr 16 18:24:23.559291 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:24:23.559177 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-s4pqq" event={"ID":"1657f577-bc69-4915-99a9-d08d76d99874","Type":"ContainerDied","Data":"968c7685878cdc894f8ceed4fde5b5e8a8c99d932d969549c7c007bd7cc4f0a1"}
Apr 16 18:24:23.559291 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:24:23.559212 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-s4pqq" event={"ID":"1657f577-bc69-4915-99a9-d08d76d99874","Type":"ContainerDied","Data":"15410557ce28dd49867b46842e34ef3ea722bada20bc8b9fe4d3e679af0b63b8"}
Apr 16 18:24:23.559291 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:24:23.559233 2574 scope.go:117] "RemoveContainer" containerID="968c7685878cdc894f8ceed4fde5b5e8a8c99d932d969549c7c007bd7cc4f0a1"
Apr 16 18:24:23.569660 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:24:23.569635 2574 scope.go:117] "RemoveContainer" containerID="968c7685878cdc894f8ceed4fde5b5e8a8c99d932d969549c7c007bd7cc4f0a1"
Apr 16 18:24:23.570002 ip-10-0-143-48 kubenswrapper[2574]: E0416 18:24:23.569964 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"968c7685878cdc894f8ceed4fde5b5e8a8c99d932d969549c7c007bd7cc4f0a1\": container with ID starting with 968c7685878cdc894f8ceed4fde5b5e8a8c99d932d969549c7c007bd7cc4f0a1 not found: ID does not exist" containerID="968c7685878cdc894f8ceed4fde5b5e8a8c99d932d969549c7c007bd7cc4f0a1"
Apr 16 18:24:23.570089 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:24:23.570014 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"968c7685878cdc894f8ceed4fde5b5e8a8c99d932d969549c7c007bd7cc4f0a1"} err="failed to get container status \"968c7685878cdc894f8ceed4fde5b5e8a8c99d932d969549c7c007bd7cc4f0a1\": rpc error: code = NotFound desc = could not find container \"968c7685878cdc894f8ceed4fde5b5e8a8c99d932d969549c7c007bd7cc4f0a1\": container with ID starting with 968c7685878cdc894f8ceed4fde5b5e8a8c99d932d969549c7c007bd7cc4f0a1 not found: ID does not exist"
Apr 16 18:24:23.578144 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:24:23.578125 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-s4pqq"]
Apr 16 18:24:23.580875 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:24:23.580854 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-s4pqq"]
Apr 16 18:24:25.336812 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:24:25.336782 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1657f577-bc69-4915-99a9-d08d76d99874" path="/var/lib/kubelet/pods/1657f577-bc69-4915-99a9-d08d76d99874/volumes"
Apr 16 18:25:13.230085 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:25:13.230008 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log"
Apr 16 18:25:13.231100 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:25:13.231077 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log"
Apr 16 18:30:13.248369 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:30:13.248339 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log"
Apr 16 18:30:13.249854 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:30:13.249833 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log"
Apr 16 18:35:13.263929 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:35:13.263900 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log"
Apr 16 18:35:13.266342 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:35:13.266317 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log"
Apr 16 18:40:13.280659 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:40:13.280626 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log"
Apr 16 18:40:13.282448 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:40:13.282426 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log"
Apr 16 18:45:13.296276 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:45:13.296249 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log"
Apr 16 18:45:13.298464 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:45:13.298442 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log"
Apr 16 18:50:13.312162 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:50:13.312126 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log"
Apr 16 18:50:13.317443 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:50:13.317421 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log"
Apr 16 18:55:13.331570 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:55:13.331491 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log"
Apr 16 18:55:13.335491 ip-10-0-143-48 kubenswrapper[2574]: I0416 18:55:13.335464 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log"
Apr 16 19:00:13.347641 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:00:13.347536 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log"
Apr 16 19:00:13.352021 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:00:13.352005 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log"
Apr 16 19:05:13.364170 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:05:13.364055 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log"
Apr 16 19:05:13.368003 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:05:13.367987 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log"
Apr 16 19:10:13.380048 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:10:13.379926 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log"
Apr 16 19:10:13.386834 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:10:13.384584 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log"
Apr 16 19:14:07.129725 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:07.129680 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-56w6g_4f7d4bee-e467-46d2-b0cf-bba98e8cd041/global-pull-secret-syncer/0.log"
Apr 16 19:14:07.337580 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:07.337548 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-m6xb7_05edfe7d-7845-4f05-a320-9d359462ba01/konnectivity-agent/0.log"
Apr 16 19:14:07.409292 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:07.409215 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-48.ec2.internal_022134da6e351443de7e25a90a6de7a4/haproxy/0.log"
Apr 16 19:14:11.293907 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:11.293877 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-74lpz_c7c69288-1997-4e93-9019-58347e21e6df/node-exporter/0.log"
Apr 16 19:14:11.313801 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:11.313776 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-74lpz_c7c69288-1997-4e93-9019-58347e21e6df/kube-rbac-proxy/0.log"
Apr 16 19:14:11.335306 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:11.335284 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-74lpz_c7c69288-1997-4e93-9019-58347e21e6df/init-textfile/0.log"
Apr 16 19:14:14.084425 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:14.084391 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-p86h9/perf-node-gather-daemonset-fnhwk"]
Apr 16 19:14:14.084813 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:14.084709 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1657f577-bc69-4915-99a9-d08d76d99874" containerName="kserve-container"
Apr 16 19:14:14.084813 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:14.084736 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1657f577-bc69-4915-99a9-d08d76d99874" containerName="kserve-container"
Apr 16 19:14:14.084813 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:14.084790 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="1657f577-bc69-4915-99a9-d08d76d99874" containerName="kserve-container"
Apr 16 19:14:14.087587 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:14.087569 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-fnhwk"
Apr 16 19:14:14.090362 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:14.090338 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-p86h9\"/\"default-dockercfg-j49c7\""
Apr 16 19:14:14.090800 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:14.090786 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-p86h9\"/\"kube-root-ca.crt\""
Apr 16 19:14:14.091312 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:14.091294 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-p86h9\"/\"openshift-service-ca.crt\""
Apr 16 19:14:14.096375 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:14.096354 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p86h9/perf-node-gather-daemonset-fnhwk"]
Apr 16 19:14:14.174607 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:14.174571 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/30d6569f-837b-4b78-aade-f4a30e293c6a-sys\") pod \"perf-node-gather-daemonset-fnhwk\" (UID: \"30d6569f-837b-4b78-aade-f4a30e293c6a\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-fnhwk"
Apr 16 19:14:14.174781 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:14.174617 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/30d6569f-837b-4b78-aade-f4a30e293c6a-podres\") pod \"perf-node-gather-daemonset-fnhwk\" (UID: \"30d6569f-837b-4b78-aade-f4a30e293c6a\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-fnhwk"
Apr 16 19:14:14.174781 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:14.174642 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/30d6569f-837b-4b78-aade-f4a30e293c6a-proc\") pod \"perf-node-gather-daemonset-fnhwk\" (UID: \"30d6569f-837b-4b78-aade-f4a30e293c6a\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-fnhwk"
Apr 16 19:14:14.174781 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:14.174719 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/30d6569f-837b-4b78-aade-f4a30e293c6a-lib-modules\") pod \"perf-node-gather-daemonset-fnhwk\" (UID: \"30d6569f-837b-4b78-aade-f4a30e293c6a\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-fnhwk"
Apr 16 19:14:14.174781 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:14.174757 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqpxm\" (UniqueName: \"kubernetes.io/projected/30d6569f-837b-4b78-aade-f4a30e293c6a-kube-api-access-rqpxm\") pod \"perf-node-gather-daemonset-fnhwk\" (UID: \"30d6569f-837b-4b78-aade-f4a30e293c6a\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-fnhwk"
Apr 16 19:14:14.275905 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:14.275869 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/30d6569f-837b-4b78-aade-f4a30e293c6a-lib-modules\") pod \"perf-node-gather-daemonset-fnhwk\" (UID: \"30d6569f-837b-4b78-aade-f4a30e293c6a\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-fnhwk"
Apr 16 19:14:14.276053 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:14.275909 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqpxm\" (UniqueName: \"kubernetes.io/projected/30d6569f-837b-4b78-aade-f4a30e293c6a-kube-api-access-rqpxm\") pod \"perf-node-gather-daemonset-fnhwk\" (UID: \"30d6569f-837b-4b78-aade-f4a30e293c6a\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-fnhwk"
Apr 16 19:14:14.276053 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:14.275945 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/30d6569f-837b-4b78-aade-f4a30e293c6a-sys\") pod \"perf-node-gather-daemonset-fnhwk\" (UID: \"30d6569f-837b-4b78-aade-f4a30e293c6a\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-fnhwk"
Apr 16 19:14:14.276053 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:14.275985 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/30d6569f-837b-4b78-aade-f4a30e293c6a-podres\") pod \"perf-node-gather-daemonset-fnhwk\" (UID: \"30d6569f-837b-4b78-aade-f4a30e293c6a\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-fnhwk"
Apr 16 19:14:14.276053 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:14.276014 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/30d6569f-837b-4b78-aade-f4a30e293c6a-sys\") pod \"perf-node-gather-daemonset-fnhwk\" (UID: \"30d6569f-837b-4b78-aade-f4a30e293c6a\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-fnhwk"
Apr 16 19:14:14.276053 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:14.276033 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/30d6569f-837b-4b78-aade-f4a30e293c6a-lib-modules\") pod \"perf-node-gather-daemonset-fnhwk\" (UID: \"30d6569f-837b-4b78-aade-f4a30e293c6a\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-fnhwk"
Apr 16 19:14:14.276226 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:14.276022 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/30d6569f-837b-4b78-aade-f4a30e293c6a-proc\") pod \"perf-node-gather-daemonset-fnhwk\" (UID: \"30d6569f-837b-4b78-aade-f4a30e293c6a\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-fnhwk"
Apr 16 19:14:14.276226 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:14.276063 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/30d6569f-837b-4b78-aade-f4a30e293c6a-proc\") pod \"perf-node-gather-daemonset-fnhwk\" (UID: \"30d6569f-837b-4b78-aade-f4a30e293c6a\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-fnhwk"
Apr 16 19:14:14.276226 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:14.276106 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/30d6569f-837b-4b78-aade-f4a30e293c6a-podres\") pod \"perf-node-gather-daemonset-fnhwk\" (UID: \"30d6569f-837b-4b78-aade-f4a30e293c6a\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-fnhwk"
Apr 16 19:14:14.283105 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:14.283085 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqpxm\" (UniqueName: \"kubernetes.io/projected/30d6569f-837b-4b78-aade-f4a30e293c6a-kube-api-access-rqpxm\") pod \"perf-node-gather-daemonset-fnhwk\" (UID: \"30d6569f-837b-4b78-aade-f4a30e293c6a\") " pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-fnhwk"
Apr 16 19:14:14.398372 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:14.398300 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-fnhwk"
Apr 16 19:14:14.512640 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:14.512604 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p86h9/perf-node-gather-daemonset-fnhwk"]
Apr 16 19:14:14.515406 ip-10-0-143-48 kubenswrapper[2574]: W0416 19:14:14.515368 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod30d6569f_837b_4b78_aade_f4a30e293c6a.slice/crio-978dc4a5e0f892b96c4f2d2448d5078ef7cb21959b7277d14ad67c903ce8fb69 WatchSource:0}: Error finding container 978dc4a5e0f892b96c4f2d2448d5078ef7cb21959b7277d14ad67c903ce8fb69: Status 404 returned error can't find the container with id 978dc4a5e0f892b96c4f2d2448d5078ef7cb21959b7277d14ad67c903ce8fb69
Apr 16 19:14:14.517021 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:14.516997 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:14:14.981549 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:14.981522 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-d6hfh_0d86e7f9-1a3a-4778-a936-753a6e1ee886/dns/0.log"
Apr 16 19:14:15.001046 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:15.001025 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-d6hfh_0d86e7f9-1a3a-4778-a936-753a6e1ee886/kube-rbac-proxy/0.log"
Apr 16 19:14:15.111788 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:15.111756 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-sqdqp_5806a3c4-be78-414a-9251-725dc0b94d51/dns-node-resolver/0.log"
Apr 16 19:14:15.226737 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:15.226691 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-fnhwk" event={"ID":"30d6569f-837b-4b78-aade-f4a30e293c6a","Type":"ContainerStarted","Data":"6419ab389c6d59cfdc46fa4bc115ae0b35b3da72634d33318985095aa5ce220c"}
Apr 16 19:14:15.226737 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:15.226739 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-fnhwk" event={"ID":"30d6569f-837b-4b78-aade-f4a30e293c6a","Type":"ContainerStarted","Data":"978dc4a5e0f892b96c4f2d2448d5078ef7cb21959b7277d14ad67c903ce8fb69"}
Apr 16 19:14:15.226937 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:15.226769 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-fnhwk"
Apr 16 19:14:15.241637 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:15.241557 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-fnhwk" podStartSLOduration=1.2415435160000001 podStartE2EDuration="1.241543516s" podCreationTimestamp="2026-04-16 19:14:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:14:15.239879695 +0000 UTC m=+3842.603000993" watchObservedRunningTime="2026-04-16 19:14:15.241543516 +0000 UTC m=+3842.604664854"
Apr 16 19:14:15.526994 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:15.526921 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4dfv8_a22f332b-dce3-4709-8d3e-a6ca1b00bc4a/node-ca/0.log"
Apr 16 19:14:16.581493 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:16.581469 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-thnb8_92d4b593-ee95-460f-9517-0583cadeaeb1/serve-healthcheck-canary/0.log"
Apr 16 19:14:16.989313 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:16.989240 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-48x9c_15d0a19e-ca6a-4564-bc4c-88b2d00b2e25/kube-rbac-proxy/0.log"
Apr 16 19:14:17.023521 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:17.023499 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-48x9c_15d0a19e-ca6a-4564-bc4c-88b2d00b2e25/exporter/0.log"
Apr 16 19:14:17.045302 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:17.045280 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-48x9c_15d0a19e-ca6a-4564-bc4c-88b2d00b2e25/extractor/0.log"
Apr 16 19:14:19.119833 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:19.119804 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-w87nh_904f3b69-7352-4923-9274-ef46c599c24a/manager/0.log"
Apr 16 19:14:21.238397 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:21.238369 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-p86h9/perf-node-gather-daemonset-fnhwk"
Apr 16 19:14:24.500234 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:24.500201 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dg7bs_b00af592-6ad1-4cc6-8dc6-0b46ced5a45c/kube-multus-additional-cni-plugins/0.log"
Apr 16 19:14:24.521380 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:24.521361 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dg7bs_b00af592-6ad1-4cc6-8dc6-0b46ced5a45c/egress-router-binary-copy/0.log"
Apr 16 19:14:24.543227 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:24.543200 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dg7bs_b00af592-6ad1-4cc6-8dc6-0b46ced5a45c/cni-plugins/0.log"
Apr 16 19:14:24.563909 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:24.563887 2574 log.go:25]
"Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dg7bs_b00af592-6ad1-4cc6-8dc6-0b46ced5a45c/bond-cni-plugin/0.log" Apr 16 19:14:24.583349 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:24.583327 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dg7bs_b00af592-6ad1-4cc6-8dc6-0b46ced5a45c/routeoverride-cni/0.log" Apr 16 19:14:24.603342 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:24.603321 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dg7bs_b00af592-6ad1-4cc6-8dc6-0b46ced5a45c/whereabouts-cni-bincopy/0.log" Apr 16 19:14:24.623338 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:24.623320 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dg7bs_b00af592-6ad1-4cc6-8dc6-0b46ced5a45c/whereabouts-cni/0.log" Apr 16 19:14:25.020976 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:25.020946 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jp8j9_d01511ca-4447-40f9-8518-9d2f62898c7a/kube-multus/0.log" Apr 16 19:14:25.100185 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:25.100160 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7gnqt_95b4efe3-7d49-40fc-a8e3-4381e92ed949/network-metrics-daemon/0.log" Apr 16 19:14:25.121908 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:25.121885 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7gnqt_95b4efe3-7d49-40fc-a8e3-4381e92ed949/kube-rbac-proxy/0.log" Apr 16 19:14:26.489829 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:26.489804 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-controller/0.log" Apr 16 19:14:26.508391 ip-10-0-143-48 kubenswrapper[2574]: I0416 
19:14:26.508371 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/0.log" Apr 16 19:14:26.524288 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:26.524263 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovn-acl-logging/1.log" Apr 16 19:14:26.542476 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:26.542456 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/kube-rbac-proxy-node/0.log" Apr 16 19:14:26.568904 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:26.568876 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 19:14:26.588183 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:26.588166 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/northd/0.log" Apr 16 19:14:26.607246 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:26.607224 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/nbdb/0.log" Apr 16 19:14:26.626040 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:26.626020 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/sbdb/0.log" Apr 16 19:14:26.724494 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:26.724468 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbm6k_4454b177-644b-4125-8259-d9aeaf036cf6/ovnkube-controller/0.log" Apr 16 19:14:27.691408 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:27.691382 
2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-hkxvh_7c5a1963-90d7-47a3-9562-1945dd539cd4/check-endpoints/0.log" Apr 16 19:14:27.713287 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:27.713266 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-9jtfh_ae61ed6e-f711-4ff6-a33f-c2ff79830f57/network-check-target-container/0.log" Apr 16 19:14:28.663767 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:28.663739 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-t6q6l_9a3d42b4-5542-4575-a00e-81713e0d9ced/iptables-alerter/0.log" Apr 16 19:14:29.268203 ip-10-0-143-48 kubenswrapper[2574]: I0416 19:14:29.268175 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-8tjw6_a31dbcfe-5679-4c94-9030-4e1442f23cf0/tuned/0.log"