Apr 20 16:23:11.653046 ip-10-0-142-44 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 16:23:12.105420 ip-10-0-142-44 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 16:23:12.105420 ip-10-0-142-44 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 16:23:12.105420 ip-10-0-142-44 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 16:23:12.105420 ip-10-0-142-44 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 16:23:12.105420 ip-10-0-142-44 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 16:23:12.106408 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.106316 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 16:23:12.108645 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108623 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 16:23:12.108645 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108641 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 16:23:12.108645 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108648 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 16:23:12.108645 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108652 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 16:23:12.108917 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108656 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 16:23:12.108917 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108661 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 16:23:12.108917 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108665 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 16:23:12.108917 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108670 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 16:23:12.108917 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108674 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 16:23:12.108917 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108677 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 16:23:12.108917 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108682 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 16:23:12.108917 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108686 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 16:23:12.108917 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108690 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 16:23:12.108917 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108694 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 16:23:12.108917 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108698 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 16:23:12.108917 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108703 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 16:23:12.108917 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108714 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 16:23:12.108917 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108719 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 16:23:12.108917 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108724 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 16:23:12.108917 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108729 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 16:23:12.108917 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108736 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 16:23:12.108917 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108741 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 16:23:12.108917 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108745 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 16:23:12.109676 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108749 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 16:23:12.109676 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108753 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 16:23:12.109676 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108772 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 16:23:12.109676 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108777 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 16:23:12.109676 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108781 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 16:23:12.109676 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108785 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 16:23:12.109676 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108789 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 16:23:12.109676 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108793 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 16:23:12.109676 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108799 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 16:23:12.109676 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108803 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 16:23:12.109676 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108807 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 16:23:12.109676 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108811 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 16:23:12.109676 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108816 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 16:23:12.109676 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108821 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 16:23:12.109676 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108826 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 16:23:12.109676 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108838 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 16:23:12.109676 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108844 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 16:23:12.109676 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108848 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 16:23:12.109676 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108853 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 16:23:12.109676 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108858 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 16:23:12.110449 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108863 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 16:23:12.110449 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108867 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 16:23:12.110449 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108870 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 16:23:12.110449 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108874 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 16:23:12.110449 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108879 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 16:23:12.110449 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108883 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 16:23:12.110449 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108887 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 16:23:12.110449 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108892 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 16:23:12.110449 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108896 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 16:23:12.110449 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108900 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 16:23:12.110449 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108904 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 16:23:12.110449 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108908 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 16:23:12.110449 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108913 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 16:23:12.110449 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108917 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 16:23:12.110449 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108921 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 16:23:12.110449 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108925 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 16:23:12.110449 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108929 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 16:23:12.110449 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108933 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 16:23:12.110449 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108938 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 16:23:12.110449 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108944 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 16:23:12.110954 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108949 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 16:23:12.110954 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108952 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 16:23:12.110954 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108957 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 16:23:12.110954 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108961 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 16:23:12.110954 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108965 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 16:23:12.110954 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108969 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 16:23:12.110954 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108975 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 16:23:12.110954 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108979 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 16:23:12.110954 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108983 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 16:23:12.110954 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108988 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 16:23:12.110954 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108992 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 16:23:12.110954 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.108997 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 16:23:12.110954 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109001 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 16:23:12.110954 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109005 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 16:23:12.110954 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109009 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 16:23:12.110954 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109013 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 16:23:12.110954 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109018 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 16:23:12.110954 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109021 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 16:23:12.110954 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109029 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 16:23:12.111703 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109034 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 16:23:12.111703 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109039 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 16:23:12.111703 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109043 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 16:23:12.111703 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109048 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 16:23:12.111703 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109688 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 16:23:12.111703 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109696 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 16:23:12.111703 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109701 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 16:23:12.111703 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109706 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 16:23:12.111703 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109710 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 16:23:12.111703 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109714 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 16:23:12.111703 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109718 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 16:23:12.111703 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109723 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 16:23:12.111703 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109727 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 16:23:12.111703 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109731 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 16:23:12.111703 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109736 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 16:23:12.111703 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109740 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 16:23:12.111703 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109744 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 16:23:12.111703 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109749 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 16:23:12.111703 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109772 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 16:23:12.112521 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109777 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 16:23:12.112521 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109781 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 16:23:12.112521 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109787 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 16:23:12.112521 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109792 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 16:23:12.112521 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109796 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 16:23:12.112521 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109800 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 16:23:12.112521 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109804 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 16:23:12.112521 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109808 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 16:23:12.112521 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109812 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 16:23:12.112521 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109816 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 16:23:12.112521 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109820 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 16:23:12.112521 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109824 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 16:23:12.112521 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109829 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 16:23:12.112521 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109834 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 16:23:12.112521 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109838 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 16:23:12.112521 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109843 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 16:23:12.112521 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109847 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 16:23:12.112521 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109852 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 16:23:12.112521 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109856 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 16:23:12.112521 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109860 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 16:23:12.113055 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109865 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 16:23:12.113055 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109869 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 16:23:12.113055 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109873 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 16:23:12.113055 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109878 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 16:23:12.113055 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109883 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 16:23:12.113055 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109887 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 16:23:12.113055 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109891 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 16:23:12.113055 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109895 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 16:23:12.113055 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109900 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 16:23:12.113055 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109904 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 16:23:12.113055 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109908 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 16:23:12.113055 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109914 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 16:23:12.113055 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109918 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 16:23:12.113055 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109922 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 16:23:12.113055 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109926 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 16:23:12.113055 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109932 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 16:23:12.113055 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109940 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 16:23:12.113055 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109946 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 16:23:12.113055 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109953 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 16:23:12.113637 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109960 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 16:23:12.113637 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109964 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 16:23:12.113637 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109969 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 16:23:12.113637 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109974 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 16:23:12.113637 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109978 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 16:23:12.113637 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109982 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 16:23:12.113637 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109986 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 16:23:12.113637 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109990 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 16:23:12.113637 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109994 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 16:23:12.113637 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.109998 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 16:23:12.113637 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.110005 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 16:23:12.113637 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.110009 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 16:23:12.113637 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.110013 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 16:23:12.113637 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.110016 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 16:23:12.113637 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.110021 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 16:23:12.113637 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.110025 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 16:23:12.113637 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.110029 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 16:23:12.113637 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.110033 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 16:23:12.113637 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.110037 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 16:23:12.113637 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.110041 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 16:23:12.114277 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.110045 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 16:23:12.114277 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.110049 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 16:23:12.114277 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.110054 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 16:23:12.114277 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.110059 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 16:23:12.114277 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.110063 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 16:23:12.114277 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.110067 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 16:23:12.114277 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.110071 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 16:23:12.114277 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.110076 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 16:23:12.114277 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.110080 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 16:23:12.114277 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.110085 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 16:23:12.114277 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.110089 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 16:23:12.114277 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.110093 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 16:23:12.114277 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111041 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 16:23:12.114277 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111064 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 16:23:12.114277 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111077 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 16:23:12.114277 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111084 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 16:23:12.114277 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111091 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 16:23:12.114277 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111097 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 16:23:12.114277 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111104 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 16:23:12.114277 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111112 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 16:23:12.114277 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111118 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 16:23:12.114923 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111123 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 16:23:12.114923 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111129 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 16:23:12.114923 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111135 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 16:23:12.114923 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111140 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 16:23:12.114923 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111145 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 20 16:23:12.114923 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111150 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 16:23:12.114923 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111155 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 20 16:23:12.114923 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111160 2576 flags.go:64] FLAG: --cloud-config=""
Apr 20 16:23:12.114923 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111165 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 16:23:12.114923 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111170 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 16:23:12.114923 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111176 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 20 16:23:12.114923 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111180 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 16:23:12.114923 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111186 2576 flags.go:64] FLAG: --config-dir=""
Apr 20 16:23:12.114923 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111190 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 16:23:12.114923 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111196 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 16:23:12.114923 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111203 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 16:23:12.114923 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111208 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 16:23:12.114923 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111213 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 16:23:12.114923 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111219 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 16:23:12.114923 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111224 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 16:23:12.114923 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111230 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 16:23:12.114923 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111235 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 16:23:12.114923 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111243 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 16:23:12.114923 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111248 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 16:23:12.114923 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111255 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 16:23:12.115584 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111260 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 16:23:12.115584 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111265 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 16:23:12.115584 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111270 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 16:23:12.115584 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111275 2576 flags.go:64] FLAG: --enable-server="true"
Apr 20 16:23:12.115584 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111280 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 16:23:12.115584 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111287 2576 flags.go:64] FLAG: --event-burst="100"
Apr 20 16:23:12.115584 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111293 2576 flags.go:64] FLAG: --event-qps="50"
Apr 20 16:23:12.115584 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111297 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 16:23:12.115584 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111303 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 16:23:12.115584 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111308 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 20 16:23:12.115584 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111314 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 16:23:12.115584 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111319 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 16:23:12.115584 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111324 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 16:23:12.115584 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111329 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 20 16:23:12.115584 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111334 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 16:23:12.115584 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111339 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 16:23:12.115584 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111344 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 16:23:12.115584 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111349 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 16:23:12.115584 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111354 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 16:23:12.115584 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111358 2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 16:23:12.115584 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111363 2576 flags.go:64] FLAG: --feature-gates=""
Apr 20 16:23:12.115584 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111370 2576 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 16:23:12.115584 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111375 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 16:23:12.115584 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111380 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 16:23:12.115584 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111385 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 20 16:23:12.116257 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111390 2576 flags.go:64] FLAG: --healthz-port="10248"
Apr 20 16:23:12.116257 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111395 2576 flags.go:64] FLAG: --help="false"
Apr 20 16:23:12.116257 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111401 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-142-44.ec2.internal"
Apr 20 16:23:12.116257 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111406 2576 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 20 16:23:12.116257 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111411 2576 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 20 16:23:12.116257 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111417 2576 flags.go:64] FLAG:
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 16:23:12.116257 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111423 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 16:23:12.116257 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111429 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 16:23:12.116257 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111434 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 16:23:12.116257 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111439 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 16:23:12.116257 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111443 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 16:23:12.116257 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111448 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 16:23:12.116257 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111453 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 16:23:12.116257 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111458 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 16:23:12.116257 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111463 2576 flags.go:64] FLAG: --kube-reserved="" Apr 20 16:23:12.116257 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111468 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 16:23:12.116257 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111473 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 16:23:12.116257 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111478 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 16:23:12.116257 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111483 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 16:23:12.116257 ip-10-0-142-44 
kubenswrapper[2576]: I0420 16:23:12.111488 2576 flags.go:64] FLAG: --lock-file="" Apr 20 16:23:12.116257 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111493 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 16:23:12.116257 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111498 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 16:23:12.116257 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111503 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 16:23:12.116257 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111512 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 16:23:12.116876 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111517 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 16:23:12.116876 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111522 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 16:23:12.116876 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111527 2576 flags.go:64] FLAG: --logging-format="text" Apr 20 16:23:12.116876 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111531 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 16:23:12.116876 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111537 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 16:23:12.116876 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111542 2576 flags.go:64] FLAG: --manifest-url="" Apr 20 16:23:12.116876 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111547 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 20 16:23:12.116876 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111554 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 16:23:12.116876 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111559 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 16:23:12.116876 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111569 2576 flags.go:64] FLAG: --max-pods="110" Apr 20 16:23:12.116876 
ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111574 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 16:23:12.116876 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111579 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 16:23:12.116876 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111584 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 16:23:12.116876 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111588 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 16:23:12.116876 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111598 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 16:23:12.116876 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111603 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 16:23:12.116876 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111608 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 16:23:12.116876 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111621 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 16:23:12.116876 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111625 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 16:23:12.116876 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111630 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 16:23:12.116876 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111635 2576 flags.go:64] FLAG: --pod-cidr="" Apr 20 16:23:12.116876 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111640 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 16:23:12.116876 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111650 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 16:23:12.117428 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111654 2576 flags.go:64] 
FLAG: --pod-max-pids="-1" Apr 20 16:23:12.117428 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111659 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 20 16:23:12.117428 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111663 2576 flags.go:64] FLAG: --port="10250" Apr 20 16:23:12.117428 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111669 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 16:23:12.117428 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111674 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e5bfa02b98bcce16" Apr 20 16:23:12.117428 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111679 2576 flags.go:64] FLAG: --qos-reserved="" Apr 20 16:23:12.117428 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111684 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 20 16:23:12.117428 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111689 2576 flags.go:64] FLAG: --register-node="true" Apr 20 16:23:12.117428 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111693 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 20 16:23:12.117428 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111698 2576 flags.go:64] FLAG: --register-with-taints="" Apr 20 16:23:12.117428 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111704 2576 flags.go:64] FLAG: --registry-burst="10" Apr 20 16:23:12.117428 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111710 2576 flags.go:64] FLAG: --registry-qps="5" Apr 20 16:23:12.117428 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111714 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 20 16:23:12.117428 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111719 2576 flags.go:64] FLAG: --reserved-memory="" Apr 20 16:23:12.117428 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111725 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 16:23:12.117428 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111730 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 
16:23:12.117428 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111735 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 16:23:12.117428 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111740 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 16:23:12.117428 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111747 2576 flags.go:64] FLAG: --runonce="false" Apr 20 16:23:12.117428 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111752 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 16:23:12.117428 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111771 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 16:23:12.117428 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111778 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 20 16:23:12.117428 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111782 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 16:23:12.117428 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111787 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 16:23:12.117428 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111793 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 16:23:12.117428 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111798 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 16:23:12.118159 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111804 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 16:23:12.118159 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111808 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 16:23:12.118159 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111813 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 16:23:12.118159 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111818 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 16:23:12.118159 ip-10-0-142-44 kubenswrapper[2576]: I0420 
16:23:12.111823 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 16:23:12.118159 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111828 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 16:23:12.118159 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111832 2576 flags.go:64] FLAG: --system-cgroups="" Apr 20 16:23:12.118159 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111837 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 16:23:12.118159 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111846 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 16:23:12.118159 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111852 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 20 16:23:12.118159 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111856 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 16:23:12.118159 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111862 2576 flags.go:64] FLAG: --tls-min-version="" Apr 20 16:23:12.118159 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111867 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 16:23:12.118159 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111872 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 16:23:12.118159 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111876 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 16:23:12.118159 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111881 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 16:23:12.118159 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111887 2576 flags.go:64] FLAG: --v="2" Apr 20 16:23:12.118159 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111894 2576 flags.go:64] FLAG: --version="false" Apr 20 16:23:12.118159 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111901 2576 flags.go:64] FLAG: --vmodule="" Apr 20 16:23:12.118159 ip-10-0-142-44 kubenswrapper[2576]: I0420 
16:23:12.111908 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 16:23:12.118159 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.111913 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 16:23:12.118159 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112068 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 16:23:12.118159 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112075 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 16:23:12.118159 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112080 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 16:23:12.118780 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112087 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 16:23:12.118780 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112092 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 16:23:12.118780 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112098 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 16:23:12.118780 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112102 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 16:23:12.118780 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112106 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 16:23:12.118780 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112110 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 16:23:12.118780 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112115 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 16:23:12.118780 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112119 2576 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesAzure Apr 20 16:23:12.118780 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112124 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 16:23:12.118780 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112128 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 16:23:12.118780 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112132 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 16:23:12.118780 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112136 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 16:23:12.118780 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112140 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 16:23:12.118780 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112144 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 16:23:12.118780 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112148 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 16:23:12.118780 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112153 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 16:23:12.118780 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112157 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 16:23:12.118780 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112161 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 16:23:12.118780 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112165 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 16:23:12.118780 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112169 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 16:23:12.119328 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112173 2576 feature_gate.go:328] 
unrecognized feature gate: ManagedBootImagesvSphere Apr 20 16:23:12.119328 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112177 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 16:23:12.119328 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112184 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 16:23:12.119328 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112190 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 16:23:12.119328 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112194 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 16:23:12.119328 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112198 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 16:23:12.119328 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112202 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 16:23:12.119328 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112206 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 16:23:12.119328 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112211 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 16:23:12.119328 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112215 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 16:23:12.119328 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112219 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 16:23:12.119328 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112223 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 16:23:12.119328 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112229 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 16:23:12.119328 ip-10-0-142-44 
kubenswrapper[2576]: W0420 16:23:12.112233 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 16:23:12.119328 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112237 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 16:23:12.119328 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112241 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 16:23:12.119328 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112245 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 16:23:12.119328 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112250 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 16:23:12.119328 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112254 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 16:23:12.119328 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112258 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 16:23:12.119840 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112263 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 16:23:12.119840 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112267 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 16:23:12.119840 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112271 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 16:23:12.119840 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112275 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 16:23:12.119840 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112279 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 16:23:12.119840 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112283 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 16:23:12.119840 
ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112287 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 16:23:12.119840 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112292 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 16:23:12.119840 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112296 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 16:23:12.119840 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112300 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 16:23:12.119840 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112304 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 16:23:12.119840 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112309 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 16:23:12.119840 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112313 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 16:23:12.119840 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112317 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 16:23:12.119840 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112321 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 16:23:12.119840 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112325 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 16:23:12.119840 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112329 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 16:23:12.119840 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112333 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 16:23:12.119840 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112337 2576 feature_gate.go:328] 
unrecognized feature gate: ClusterAPIInstall Apr 20 16:23:12.119840 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112341 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 16:23:12.120338 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112345 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 16:23:12.120338 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112349 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 16:23:12.120338 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112353 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 20 16:23:12.120338 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112357 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 16:23:12.120338 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112363 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 16:23:12.120338 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112368 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 16:23:12.120338 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112372 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 16:23:12.120338 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112376 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 16:23:12.120338 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112380 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 16:23:12.120338 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112384 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 16:23:12.120338 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112389 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 16:23:12.120338 ip-10-0-142-44 kubenswrapper[2576]: 
W0420 16:23:12.112393 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 16:23:12.120338 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112398 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 16:23:12.120338 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112402 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 16:23:12.120338 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112406 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 16:23:12.120338 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112410 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 16:23:12.120338 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112415 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 16:23:12.120338 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112422 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 16:23:12.120338 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112428 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 16:23:12.120834 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112433 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 16:23:12.120834 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112437 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 16:23:12.120834 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112442 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 16:23:12.120834 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.112446 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 16:23:12.120834 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.113335 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 16:23:12.120834 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.119889 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 20 16:23:12.120834 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.120004 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 20 16:23:12.120834 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120053 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 16:23:12.120834 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120058 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 
16:23:12.120834 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120062 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 20 16:23:12.120834 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120066 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 16:23:12.120834 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120069 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 16:23:12.120834 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120072 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 16:23:12.120834 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120074 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 16:23:12.120834 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120077 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 16:23:12.120834 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120079 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 16:23:12.121224 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120082 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 16:23:12.121224 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120085 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 16:23:12.121224 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120087 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 16:23:12.121224 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120090 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 16:23:12.121224 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120092 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 16:23:12.121224 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120095 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall 
Apr 20 16:23:12.121224 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120098 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 16:23:12.121224 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120101 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 16:23:12.121224 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120104 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 16:23:12.121224 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120106 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 16:23:12.121224 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120109 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 16:23:12.121224 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120112 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 16:23:12.121224 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120115 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 16:23:12.121224 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120117 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 16:23:12.121224 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120120 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 16:23:12.121224 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120122 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 16:23:12.121224 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120125 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 16:23:12.121224 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120127 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 16:23:12.121224 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120130 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 16:23:12.121224 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120132 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 16:23:12.121726 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120135 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 16:23:12.121726 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120137 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 16:23:12.121726 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120142 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 16:23:12.121726 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120144 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 16:23:12.121726 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120147 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 16:23:12.121726 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120149 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 16:23:12.121726 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120152 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 16:23:12.121726 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120155 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 16:23:12.121726 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120157 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 16:23:12.121726 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120159 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 16:23:12.121726 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120162 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 16:23:12.121726 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120165 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 16:23:12.121726 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120167 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 16:23:12.121726 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120169 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 16:23:12.121726 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120172 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 16:23:12.121726 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120174 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 16:23:12.121726 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120177 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 16:23:12.121726 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120180 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 16:23:12.121726 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120182 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 16:23:12.121726 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120185 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 16:23:12.122240 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120187 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 16:23:12.122240 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120190 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 16:23:12.122240 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120193 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 16:23:12.122240 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120198 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 16:23:12.122240 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120203 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 16:23:12.122240 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120206 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 16:23:12.122240 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120209 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 16:23:12.122240 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120212 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 16:23:12.122240 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120215 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 16:23:12.122240 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120217 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 16:23:12.122240 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120220 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 16:23:12.122240 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120222 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 16:23:12.122240 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120224 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 16:23:12.122240 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120227 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 16:23:12.122240 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120231 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 16:23:12.122240 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120235 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 16:23:12.122240 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120238 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 16:23:12.122240 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120241 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 16:23:12.122240 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120244 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 16:23:12.122712 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120246 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 16:23:12.122712 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120249 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 16:23:12.122712 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120251 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 16:23:12.122712 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120254 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 16:23:12.122712 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120256 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 16:23:12.122712 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120259 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 16:23:12.122712 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120261 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 16:23:12.122712 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120264 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 16:23:12.122712 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120266 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 16:23:12.122712 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120269 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 16:23:12.122712 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120271 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 16:23:12.122712 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120274 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 16:23:12.122712 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120276 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 16:23:12.122712 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120279 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 16:23:12.122712 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120281 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 16:23:12.122712 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120284 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 16:23:12.122712 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120286 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 16:23:12.122712 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120289 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 16:23:12.123205 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.120294 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 16:23:12.123205 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120421 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 16:23:12.123205 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120427 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 16:23:12.123205 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120430 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 16:23:12.123205 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120433 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 16:23:12.123205 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120436 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 16:23:12.123205 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120439 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 16:23:12.123205 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120442 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 16:23:12.123205 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120444 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 16:23:12.123205 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120447 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 16:23:12.123205 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120450 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 16:23:12.123205 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120453 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 16:23:12.123205 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120456 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 16:23:12.123205 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120460 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 16:23:12.123205 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120462 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 16:23:12.123205 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120465 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 16:23:12.123601 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120467 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 16:23:12.123601 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120470 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 16:23:12.123601 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120472 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 16:23:12.123601 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120475 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 16:23:12.123601 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120477 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 16:23:12.123601 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120480 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 16:23:12.123601 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120483 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 16:23:12.123601 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120485 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 16:23:12.123601 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120488 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 16:23:12.123601 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120490 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 16:23:12.123601 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120492 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 16:23:12.123601 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120495 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 16:23:12.123601 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120498 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 16:23:12.123601 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120500 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 16:23:12.123601 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120503 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 16:23:12.123601 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120505 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 16:23:12.123601 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120507 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 16:23:12.123601 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120510 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 16:23:12.123601 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120512 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 16:23:12.123601 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120515 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 16:23:12.124116 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120517 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 16:23:12.124116 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120520 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 16:23:12.124116 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120524 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 16:23:12.124116 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120528 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 16:23:12.124116 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120531 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 16:23:12.124116 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120534 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 16:23:12.124116 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120537 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 16:23:12.124116 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120539 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 16:23:12.124116 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120542 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 16:23:12.124116 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120545 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 16:23:12.124116 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120547 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 16:23:12.124116 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120550 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 16:23:12.124116 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120552 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 16:23:12.124116 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120555 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 16:23:12.124116 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120558 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 16:23:12.124116 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120560 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 16:23:12.124116 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120563 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 16:23:12.124116 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120565 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 16:23:12.124116 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120568 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 16:23:12.124580 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120570 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 16:23:12.124580 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120573 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 16:23:12.124580 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120575 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 16:23:12.124580 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120578 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 16:23:12.124580 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120580 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 16:23:12.124580 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120583 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 16:23:12.124580 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120585 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 16:23:12.124580 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120588 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 16:23:12.124580 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120591 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 16:23:12.124580 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120593 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 16:23:12.124580 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120596 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 16:23:12.124580 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120598 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 16:23:12.124580 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120600 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 16:23:12.124580 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120603 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 16:23:12.124580 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120605 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 16:23:12.124580 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120608 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 16:23:12.124580 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120610 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 16:23:12.124580 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120613 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 16:23:12.124580 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120616 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 16:23:12.124580 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120618 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 16:23:12.125183 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120621 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 16:23:12.125183 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120623 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 16:23:12.125183 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120626 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 16:23:12.125183 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120629 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 16:23:12.125183 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120631 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 16:23:12.125183 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120634 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 16:23:12.125183 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120636 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 16:23:12.125183 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120639 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 16:23:12.125183 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120641 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 16:23:12.125183 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120645 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 16:23:12.125183 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120649 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 16:23:12.125183 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:12.120652 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 16:23:12.125183 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.120657 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 16:23:12.125183 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.120803 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 16:23:12.125529 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.124901 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 16:23:12.126035 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.126023 2576 server.go:1019] "Starting client certificate rotation"
Apr 20 16:23:12.126148 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.126130 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 16:23:12.126184 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.126170 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 16:23:12.153687 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.153664 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 16:23:12.155520 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.155498 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 16:23:12.176024 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.176001 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 20 16:23:12.181881 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.181865 2576 log.go:25] "Validated CRI v1 image API"
Apr 20 16:23:12.183143 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.183119 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 16:23:12.186124 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.186098 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 16:23:12.186462 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.186442 2576 fs.go:135] Filesystem UUIDs: map[43ab50d1-f267-443f-9b05-08b4ca640305:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 8d05cc42-9318-40ac-97f3-59d6ceb8f561:/dev/nvme0n1p3]
Apr 20 16:23:12.186502 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.186463 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 16:23:12.191875 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.191742 2576 manager.go:217] Machine: {Timestamp:2026-04-20 16:23:12.190576977 +0000 UTC m=+0.416784910 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3104077 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b3febc2b478a48a0f213822638bea SystemUUID:ec2b3feb-c2b4-78a4-8a0f-213822638bea BootID:137f1379-3137-4d51-abb1-32cc9077ea99 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:22:28:3c:f3:dd Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:22:28:3c:f3:dd Speed:0 Mtu:9001} {Name:ovs-system MacAddress:9a:be:00:bc:1d:75 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 16:23:12.191875 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.191870 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 16:23:12.192012 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.192000 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 16:23:12.192343 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.192320 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 16:23:12.192478 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.192345 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-44.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 16:23:12.192523 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.192491 2576 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 16:23:12.192523 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.192499 2576 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 16:23:12.192523 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.192516 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 16:23:12.193290 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.193279 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 16:23:12.194670 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.194660 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 16:23:12.194824 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.194814 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 16:23:12.197241 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.197231 2576 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 16:23:12.197288 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.197248 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 16:23:12.197288 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.197263 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 16:23:12.197288 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.197272 2576 kubelet.go:397] "Adding apiserver pod source"
Apr 20 16:23:12.197288 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.197281 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 16:23:12.198433 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.198421 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 16:23:12.198478 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.198440 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 16:23:12.201805 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.201783 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 20 16:23:12.203163 ip-10-0-142-44
kubenswrapper[2576]: I0420 16:23:12.203150 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 16:23:12.205279 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.205254 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 16:23:12.205398 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.205285 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 16:23:12.205398 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.205296 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 16:23:12.205398 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.205305 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 16:23:12.205398 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.205321 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 16:23:12.205398 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.205334 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 16:23:12.205398 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.205343 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 16:23:12.205398 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.205352 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 16:23:12.205398 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.205363 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 16:23:12.205398 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.205372 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 16:23:12.205398 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.205384 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 16:23:12.205398 
ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.205398 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 16:23:12.206348 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.206322 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 16:23:12.206445 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.206348 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 16:23:12.210358 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.210340 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 16:23:12.210471 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.210380 2576 server.go:1295] "Started kubelet" Apr 20 16:23:12.210544 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.210494 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 16:23:12.210579 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.210572 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 16:23:12.210901 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.210876 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 16:23:12.211363 ip-10-0-142-44 systemd[1]: Started Kubernetes Kubelet. 
Apr 20 16:23:12.211665 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:12.211641 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-44.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 20 16:23:12.211717 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:12.211688 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 20 16:23:12.211857 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.211837 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-44.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 16:23:12.213934 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.213919 2576 server.go:317] "Adding debug handlers to kubelet server"
Apr 20 16:23:12.214916 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.214899 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 20 16:23:12.216955 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:12.216006 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-44.ec2.internal.18a81d3afe5b253c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-44.ec2.internal,UID:ip-10-0-142-44.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-142-44.ec2.internal,},FirstTimestamp:2026-04-20 16:23:12.210355516 +0000 UTC m=+0.436563449,LastTimestamp:2026-04-20 16:23:12.210355516 +0000 UTC m=+0.436563449,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-44.ec2.internal,}"
Apr 20 16:23:12.218942 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.218923 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 20 16:23:12.219260 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:12.219241 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 16:23:12.219466 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.219453 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 20 16:23:12.220059 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.220043 2576 factory.go:55] Registering systemd factory
Apr 20 16:23:12.220145 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.220094 2576 factory.go:223] Registration of the systemd container factory successfully
Apr 20 16:23:12.220145 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.220096 2576 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 20 16:23:12.220145 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.220114 2576 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 20 16:23:12.220145 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.220093 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 20 16:23:12.220310 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.220184 2576 reconstruct.go:97] "Volume reconstruction finished"
Apr 20 16:23:12.220310 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.220192 2576 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 16:23:12.220310 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.220308 2576 factory.go:153] Registering CRI-O factory
Apr 20 16:23:12.220439 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.220321 2576 factory.go:223] Registration of the crio container factory successfully
Apr 20 16:23:12.220439 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.220365 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 16:23:12.220439 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.220392 2576 factory.go:103] Registering Raw factory
Apr 20 16:23:12.220439 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.220409 2576 manager.go:1196] Started watching for new ooms in manager
Apr 20 16:23:12.220603 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:12.220493 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-44.ec2.internal\" not found"
Apr 20 16:23:12.220850 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.220836 2576 manager.go:319] Starting recovery of all containers
Apr 20 16:23:12.225473 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:12.225440 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-142-44.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 20 16:23:12.225602 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:12.225576 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 20 16:23:12.233932 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.233913 2576 manager.go:324] Recovery completed
Apr 20 16:23:12.237643 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.237617 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-4zrnz"
Apr 20 16:23:12.238066 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.238055 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 16:23:12.240463 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.240444 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-44.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 16:23:12.240535 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.240476 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-44.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 16:23:12.240535 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.240488 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-44.ec2.internal" event="NodeHasSufficientPID"
Apr 20 16:23:12.241072 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.241056 2576 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 16:23:12.241072 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.241070 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 16:23:12.241166 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.241088 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 16:23:12.242675 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:12.242602 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-44.ec2.internal.18a81d3b00268cbc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-44.ec2.internal,UID:ip-10-0-142-44.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-142-44.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-142-44.ec2.internal,},FirstTimestamp:2026-04-20 16:23:12.240463036 +0000 UTC m=+0.466670969,LastTimestamp:2026-04-20 16:23:12.240463036 +0000 UTC m=+0.466670969,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-44.ec2.internal,}"
Apr 20 16:23:12.243231 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.243209 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-4zrnz"
Apr 20 16:23:12.243303 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.243297 2576 policy_none.go:49] "None policy: Start"
Apr 20 16:23:12.243348 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.243321 2576 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 16:23:12.243348 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.243333 2576 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 16:23:12.288515 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.288496 2576 manager.go:341] "Starting Device Plugin manager"
Apr 20 16:23:12.292978 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:12.288594 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 16:23:12.292978 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.288608 2576 server.go:85] "Starting device plugin registration server"
Apr 20 16:23:12.292978 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.288918 2576 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 16:23:12.292978 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.288933 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 16:23:12.292978 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.289020 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 16:23:12.292978 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.289133 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 16:23:12.292978 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.289140 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 16:23:12.292978 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:12.289887 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 16:23:12.292978 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:12.289928 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-44.ec2.internal\" not found"
Apr 20 16:23:12.356616 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.356524 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 16:23:12.357803 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.357784 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 16:23:12.357862 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.357815 2576 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 16:23:12.357862 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.357836 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 20 16:23:12.357862 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.357846 2576 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 20 16:23:12.357988 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:12.357943 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 20 16:23:12.359589 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.359568 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 16:23:12.389676 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.389642 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 16:23:12.390583 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.390564 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-44.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 16:23:12.390688 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.390603 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-44.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 16:23:12.390688 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.390619 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-44.ec2.internal" event="NodeHasSufficientPID"
Apr 20 16:23:12.390688 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.390650 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-44.ec2.internal"
Apr 20 16:23:12.398570 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.398551 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-44.ec2.internal"
Apr 20 16:23:12.398638 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:12.398578 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-44.ec2.internal\": node \"ip-10-0-142-44.ec2.internal\" not found"
Apr 20 16:23:12.414391 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:12.414360 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-44.ec2.internal\" not found"
Apr 20 16:23:12.458941 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.458906 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-44.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-142-44.ec2.internal"]
Apr 20 16:23:12.459035 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.459007 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 16:23:12.460382 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.460365 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-44.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 16:23:12.460443 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.460396 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-44.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 16:23:12.460443 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.460410 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-44.ec2.internal" event="NodeHasSufficientPID"
Apr 20 16:23:12.461570 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.461557 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 16:23:12.461709 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.461693 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-44.ec2.internal"
Apr 20 16:23:12.461775 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.461724 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 16:23:12.468444 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.468426 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-44.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 16:23:12.468531 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.468454 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-44.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 16:23:12.468531 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.468464 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-44.ec2.internal" event="NodeHasSufficientPID"
Apr 20 16:23:12.468531 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.468425 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-44.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 16:23:12.468531 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.468530 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-44.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 16:23:12.468643 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.468542 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-44.ec2.internal" event="NodeHasSufficientPID"
Apr 20 16:23:12.470022 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.470007 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-44.ec2.internal"
Apr 20 16:23:12.470088 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.470035 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 16:23:12.470780 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.470754 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-44.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 16:23:12.470861 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.470798 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-44.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 16:23:12.470861 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.470813 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-44.ec2.internal" event="NodeHasSufficientPID"
Apr 20 16:23:12.500534 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:12.500513 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-44.ec2.internal\" not found" node="ip-10-0-142-44.ec2.internal"
Apr 20 16:23:12.504771 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:12.504742 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-44.ec2.internal\" not found" node="ip-10-0-142-44.ec2.internal"
Apr 20 16:23:12.514965 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:12.514937 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-44.ec2.internal\" not found"
Apr 20 16:23:12.522545 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.522517 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/baf1f649d76438d5a7a5af68df1c18c0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-44.ec2.internal\" (UID: \"baf1f649d76438d5a7a5af68df1c18c0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-44.ec2.internal"
Apr 20 16:23:12.522650 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.522553 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/baf1f649d76438d5a7a5af68df1c18c0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-44.ec2.internal\" (UID: \"baf1f649d76438d5a7a5af68df1c18c0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-44.ec2.internal"
Apr 20 16:23:12.522650 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.522570 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1fe42efe7b737ea774d310634568d2b9-config\") pod \"kube-apiserver-proxy-ip-10-0-142-44.ec2.internal\" (UID: \"1fe42efe7b737ea774d310634568d2b9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-44.ec2.internal"
Apr 20 16:23:12.615922 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:12.615846 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-44.ec2.internal\" not found"
Apr 20 16:23:12.623308 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.623273 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/baf1f649d76438d5a7a5af68df1c18c0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-44.ec2.internal\" (UID: \"baf1f649d76438d5a7a5af68df1c18c0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-44.ec2.internal"
Apr 20 16:23:12.623376 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.623314 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/baf1f649d76438d5a7a5af68df1c18c0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-44.ec2.internal\" (UID: \"baf1f649d76438d5a7a5af68df1c18c0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-44.ec2.internal"
Apr 20 16:23:12.623376 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.623335 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1fe42efe7b737ea774d310634568d2b9-config\") pod \"kube-apiserver-proxy-ip-10-0-142-44.ec2.internal\" (UID: \"1fe42efe7b737ea774d310634568d2b9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-44.ec2.internal"
Apr 20 16:23:12.623376 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.623364 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/baf1f649d76438d5a7a5af68df1c18c0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-44.ec2.internal\" (UID: \"baf1f649d76438d5a7a5af68df1c18c0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-44.ec2.internal"
Apr 20 16:23:12.623376 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.623369 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/baf1f649d76438d5a7a5af68df1c18c0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-44.ec2.internal\" (UID: \"baf1f649d76438d5a7a5af68df1c18c0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-44.ec2.internal"
Apr 20 16:23:12.623497 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.623391 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1fe42efe7b737ea774d310634568d2b9-config\") pod \"kube-apiserver-proxy-ip-10-0-142-44.ec2.internal\" (UID: \"1fe42efe7b737ea774d310634568d2b9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-44.ec2.internal"
Apr 20 16:23:12.716736 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:12.716697 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-44.ec2.internal\" not found"
Apr 20 16:23:12.802322 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.802291 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-44.ec2.internal"
Apr 20 16:23:12.807997 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:12.807972 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-44.ec2.internal"
Apr 20 16:23:12.817613 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:12.817587 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-44.ec2.internal\" not found"
Apr 20 16:23:12.918187 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:12.918088 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-44.ec2.internal\" not found"
Apr 20 16:23:13.018637 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:13.018596 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-44.ec2.internal\" not found"
Apr 20 16:23:13.119181 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:13.119140 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-44.ec2.internal\" not found"
Apr 20 16:23:13.125441 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:13.125412 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 16:23:13.125579 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:13.125562 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 16:23:13.219376 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:13.219296 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-44.ec2.internal\" not found"
Apr 20 16:23:13.219376 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:13.219329 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 16:23:13.229883 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:13.229862 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 16:23:13.245662 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:13.245633 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 16:18:12 +0000 UTC" deadline="2027-11-25 15:42:04.804696697 +0000 UTC"
Apr 20 16:23:13.245662 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:13.245661 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14015h18m51.559039031s"
Apr 20 16:23:13.247240 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:13.247223 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-lms4p"
Apr 20 16:23:13.255816 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:13.255795 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-lms4p"
Apr 20 16:23:13.319595 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:13.319565 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-44.ec2.internal\" not found"
Apr 20 16:23:13.352629 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:13.352603 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 16:23:13.420276 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:13.420098 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-44.ec2.internal\" not found"
Apr 20 16:23:13.451470 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:13.451439 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fe42efe7b737ea774d310634568d2b9.slice/crio-9fc78dbc143bbe9a5907afe345e81612cc232c4c5870933cb47a726f06aeb599 WatchSource:0}: Error finding container 9fc78dbc143bbe9a5907afe345e81612cc232c4c5870933cb47a726f06aeb599: Status 404 returned error can't find the container with id 9fc78dbc143bbe9a5907afe345e81612cc232c4c5870933cb47a726f06aeb599
Apr 20 16:23:13.451850 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:13.451834 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaf1f649d76438d5a7a5af68df1c18c0.slice/crio-99f95aabdf92d11a1f075675952d222670e934d3e347c274bfe5e41a7f900dc6 WatchSource:0}: Error finding container 99f95aabdf92d11a1f075675952d222670e934d3e347c274bfe5e41a7f900dc6: Status 404 returned error can't find the container with id 99f95aabdf92d11a1f075675952d222670e934d3e347c274bfe5e41a7f900dc6
Apr 20 16:23:13.456375 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:13.456352 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 16:23:13.520503 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:13.520392 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-44.ec2.internal\" not found"
Apr 20 16:23:13.619865 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:13.618578 2576
reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 16:23:13.619865 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:13.619861 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-44.ec2.internal" Apr 20 16:23:13.633240 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:13.633215 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 16:23:13.635381 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:13.635366 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-44.ec2.internal" Apr 20 16:23:13.641784 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:13.641750 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 16:23:13.770229 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:13.770193 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 16:23:14.198423 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.198391 2576 apiserver.go:52] "Watching apiserver" Apr 20 16:23:14.208411 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.208383 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 16:23:14.209049 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.208999 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-node-s95ld","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd","openshift-dns/node-resolver-tnl7m","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-44.ec2.internal","openshift-multus/multus-v7jxk","openshift-multus/network-metrics-daemon-tr5xd","openshift-network-diagnostics/network-check-target-cq4h4","openshift-network-operator/iptables-alerter-f2nfg","kube-system/konnectivity-agent-7q74k","kube-system/kube-apiserver-proxy-ip-10-0-142-44.ec2.internal","openshift-cluster-node-tuning-operator/tuned-zblls","openshift-image-registry/node-ca-wn2xt","openshift-multus/multus-additional-cni-plugins-wsgnn"] Apr 20 16:23:14.212355 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.212330 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wn2xt" Apr 20 16:23:14.213560 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.213533 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tnl7m" Apr 20 16:23:14.215035 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.215010 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 16:23:14.215265 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.215246 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 16:23:14.215377 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.215359 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.216126 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.216101 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 16:23:14.216330 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.216315 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 16:23:14.217011 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.216990 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.220744 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.219413 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 16:23:14.220744 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.219704 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 16:23:14.220744 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.219783 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 16:23:14.220744 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.219851 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5xd" Apr 20 16:23:14.220744 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.220013 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 16:23:14.220744 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.220174 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 16:23:14.221917 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.221330 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 16:23:14.221917 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.221608 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 16:23:14.222668 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.222645 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 16:23:14.223039 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.223020 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd" Apr 20 16:23:14.223845 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.223503 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-7czc8\"" Apr 20 16:23:14.223845 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.223506 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-4bn2x\"" Apr 20 16:23:14.223995 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.223895 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 16:23:14.224105 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.224086 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-9ln2x\"" Apr 20 16:23:14.224320 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:14.224291 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tr5xd" podUID="7948c105-68aa-437a-a0ac-fa0d535c7b37" Apr 20 16:23:14.224871 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.224850 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 16:23:14.225260 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.225047 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-q9l2q\"" Apr 20 16:23:14.225337 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.225319 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 16:23:14.227348 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.225908 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-f2nfg" Apr 20 16:23:14.227348 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.226462 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cq4h4" Apr 20 16:23:14.227348 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:14.226533 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cq4h4" podUID="3b741c5a-ce22-4075-86e3-0c1155e94215" Apr 20 16:23:14.227631 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.227612 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 16:23:14.228591 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.228491 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 16:23:14.228591 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.228512 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 16:23:14.228591 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.228528 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 16:23:14.228779 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.228599 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-r7p46\"" Apr 20 16:23:14.228779 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.228704 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-dbrc9\"" Apr 20 16:23:14.229001 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.228984 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-7q74k" Apr 20 16:23:14.229518 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.229500 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 16:23:14.229604 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.229522 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 16:23:14.231779 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.231749 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wsgnn" Apr 20 16:23:14.231929 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.231908 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-n4wl7\"" Apr 20 16:23:14.232099 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232080 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8173344d-6d10-4de8-8cc3-ce875eb9dc21-socket-dir\") pod \"aws-ebs-csi-driver-node-gfjdd\" (UID: \"8173344d-6d10-4de8-8cc3-ce875eb9dc21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd" Apr 20 16:23:14.232156 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232117 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8173344d-6d10-4de8-8cc3-ce875eb9dc21-registration-dir\") pod \"aws-ebs-csi-driver-node-gfjdd\" (UID: \"8173344d-6d10-4de8-8cc3-ce875eb9dc21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd" Apr 20 16:23:14.232156 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.231802 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 16:23:14.232156 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232141 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8173344d-6d10-4de8-8cc3-ce875eb9dc21-device-dir\") pod \"aws-ebs-csi-driver-node-gfjdd\" (UID: \"8173344d-6d10-4de8-8cc3-ce875eb9dc21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd" Apr 20 16:23:14.232260 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232165 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8173344d-6d10-4de8-8cc3-ce875eb9dc21-etc-selinux\") pod \"aws-ebs-csi-driver-node-gfjdd\" (UID: \"8173344d-6d10-4de8-8cc3-ce875eb9dc21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd" Apr 20 16:23:14.232260 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232188 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-os-release\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.232260 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232211 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/23487f52-5abf-4f26-b6e5-427ce8611cdb-cni-binary-copy\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.232260 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232245 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/8173344d-6d10-4de8-8cc3-ce875eb9dc21-sys-fs\") pod \"aws-ebs-csi-driver-node-gfjdd\" (UID: \"8173344d-6d10-4de8-8cc3-ce875eb9dc21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd" Apr 20 16:23:14.232419 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232274 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-host-var-lib-cni-bin\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.232419 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232300 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-etc-kubernetes\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.232419 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232325 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln67g\" (UniqueName: \"kubernetes.io/projected/3b741c5a-ce22-4075-86e3-0c1155e94215-kube-api-access-ln67g\") pod \"network-check-target-cq4h4\" (UID: \"3b741c5a-ce22-4075-86e3-0c1155e94215\") " pod="openshift-network-diagnostics/network-check-target-cq4h4" Apr 20 16:23:14.232419 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232342 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 16:23:14.232419 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232354 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-host-kubelet\") pod 
\"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.232419 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232378 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.232419 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232403 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs\") pod \"network-metrics-daemon-tr5xd\" (UID: \"7948c105-68aa-437a-a0ac-fa0d535c7b37\") " pod="openshift-multus/network-metrics-daemon-tr5xd" Apr 20 16:23:14.232697 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232426 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/23487f52-5abf-4f26-b6e5-427ce8611cdb-multus-daemon-config\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.232697 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232450 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-host-slash\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.232697 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232472 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-run-systemd\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.232697 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232507 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-run-openvswitch\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.232697 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232532 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-run-ovn\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.232697 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232556 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3c78e1c2-fb6e-458b-8593-64d3e48a714e-env-overrides\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.232697 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232580 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3c78e1c2-fb6e-458b-8593-64d3e48a714e-ovn-node-metrics-cert\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 
16:23:14.232697 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232602 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-multus-conf-dir\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.232697 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232625 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-host-run-netns\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.232697 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232649 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j4hs\" (UniqueName: \"kubernetes.io/projected/7948c105-68aa-437a-a0ac-fa0d535c7b37-kube-api-access-8j4hs\") pod \"network-metrics-daemon-tr5xd\" (UID: \"7948c105-68aa-437a-a0ac-fa0d535c7b37\") " pod="openshift-multus/network-metrics-daemon-tr5xd" Apr 20 16:23:14.232697 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232673 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-host-var-lib-cni-multus\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.232697 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232697 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-hostroot\") pod \"multus-v7jxk\" (UID: 
\"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.233324 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232720 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8173344d-6d10-4de8-8cc3-ce875eb9dc21-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gfjdd\" (UID: \"8173344d-6d10-4de8-8cc3-ce875eb9dc21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd" Apr 20 16:23:14.233324 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232745 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-multus-socket-dir-parent\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.233324 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232799 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-host-run-k8s-cni-cncf-io\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.233324 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232824 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bc10ffb1-dd19-4a22-a3ed-7437a80f1ba7-host\") pod \"node-ca-wn2xt\" (UID: \"bc10ffb1-dd19-4a22-a3ed-7437a80f1ba7\") " pod="openshift-image-registry/node-ca-wn2xt" Apr 20 16:23:14.233324 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232846 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-var-lib-openvswitch\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.233324 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232893 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-etc-openvswitch\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.233324 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232940 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-system-cni-dir\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.233324 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232971 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-multus-cni-dir\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.233324 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.232997 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-host-run-netns\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.233324 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.233026 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-node-log\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.233324 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.233050 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-log-socket\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.233324 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.233074 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-host-run-ovn-kubernetes\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.233324 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.233097 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-host-cni-bin\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.233324 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.233121 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqsgm\" (UniqueName: \"kubernetes.io/projected/8173344d-6d10-4de8-8cc3-ce875eb9dc21-kube-api-access-gqsgm\") pod \"aws-ebs-csi-driver-node-gfjdd\" (UID: \"8173344d-6d10-4de8-8cc3-ce875eb9dc21\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd" Apr 20 16:23:14.233324 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.233145 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjrsr\" (UniqueName: \"kubernetes.io/projected/0f424042-eb12-467e-85c1-cbdd302c3e4d-kube-api-access-zjrsr\") pod \"node-resolver-tnl7m\" (UID: \"0f424042-eb12-467e-85c1-cbdd302c3e4d\") " pod="openshift-dns/node-resolver-tnl7m" Apr 20 16:23:14.233324 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.233173 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-cnibin\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.233324 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.233195 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-host-cni-netd\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.234194 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.233217 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3c78e1c2-fb6e-458b-8593-64d3e48a714e-ovnkube-config\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.234194 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.233238 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/3c78e1c2-fb6e-458b-8593-64d3e48a714e-ovnkube-script-lib\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.234194 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.233260 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x89l\" (UniqueName: \"kubernetes.io/projected/3c78e1c2-fb6e-458b-8593-64d3e48a714e-kube-api-access-9x89l\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.234194 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.233285 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0f424042-eb12-467e-85c1-cbdd302c3e4d-hosts-file\") pod \"node-resolver-tnl7m\" (UID: \"0f424042-eb12-467e-85c1-cbdd302c3e4d\") " pod="openshift-dns/node-resolver-tnl7m" Apr 20 16:23:14.234194 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.233358 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0f424042-eb12-467e-85c1-cbdd302c3e4d-tmp-dir\") pod \"node-resolver-tnl7m\" (UID: \"0f424042-eb12-467e-85c1-cbdd302c3e4d\") " pod="openshift-dns/node-resolver-tnl7m" Apr 20 16:23:14.234194 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.233436 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-host-run-multus-certs\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.234194 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.233540 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j87c8\" (UniqueName: \"kubernetes.io/projected/23487f52-5abf-4f26-b6e5-427ce8611cdb-kube-api-access-j87c8\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.234194 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.233564 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bc10ffb1-dd19-4a22-a3ed-7437a80f1ba7-serviceca\") pod \"node-ca-wn2xt\" (UID: \"bc10ffb1-dd19-4a22-a3ed-7437a80f1ba7\") " pod="openshift-image-registry/node-ca-wn2xt" Apr 20 16:23:14.234194 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.233587 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk5v7\" (UniqueName: \"kubernetes.io/projected/bc10ffb1-dd19-4a22-a3ed-7437a80f1ba7-kube-api-access-vk5v7\") pod \"node-ca-wn2xt\" (UID: \"bc10ffb1-dd19-4a22-a3ed-7437a80f1ba7\") " pod="openshift-image-registry/node-ca-wn2xt" Apr 20 16:23:14.234194 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.233611 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-systemd-units\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.234194 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.233633 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-host-var-lib-kubelet\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.234678 ip-10-0-142-44 
kubenswrapper[2576]: I0420 16:23:14.234661 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vkvxh\"" Apr 20 16:23:14.234896 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.234877 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 16:23:14.236329 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.236311 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 16:23:14.237041 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.237018 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zblls" Apr 20 16:23:14.239320 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.239302 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 16:23:14.239421 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.239344 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-vgjrd\"" Apr 20 16:23:14.239421 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.239378 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 16:23:14.257651 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.257617 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 16:18:13 +0000 UTC" deadline="2027-12-25 10:00:13.196551805 +0000 UTC" Apr 20 16:23:14.257788 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.257688 2576 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="14729h36m58.938867964s" Apr 20 16:23:14.321798 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.321746 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 16:23:14.334961 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.334789 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-host-var-lib-kubelet\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.334961 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.334834 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8173344d-6d10-4de8-8cc3-ce875eb9dc21-registration-dir\") pod \"aws-ebs-csi-driver-node-gfjdd\" (UID: \"8173344d-6d10-4de8-8cc3-ce875eb9dc21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd" Apr 20 16:23:14.334961 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.334860 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8173344d-6d10-4de8-8cc3-ce875eb9dc21-etc-selinux\") pod \"aws-ebs-csi-driver-node-gfjdd\" (UID: \"8173344d-6d10-4de8-8cc3-ce875eb9dc21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd" Apr 20 16:23:14.334961 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.334878 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-host-var-lib-kubelet\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.334961 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.334884 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-os-release\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.334961 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.334922 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8173344d-6d10-4de8-8cc3-ce875eb9dc21-registration-dir\") pod \"aws-ebs-csi-driver-node-gfjdd\" (UID: \"8173344d-6d10-4de8-8cc3-ce875eb9dc21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd" Apr 20 16:23:14.334961 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.334938 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/23487f52-5abf-4f26-b6e5-427ce8611cdb-cni-binary-copy\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.334961 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.334944 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-os-release\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.334961 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.334955 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8173344d-6d10-4de8-8cc3-ce875eb9dc21-etc-selinux\") pod \"aws-ebs-csi-driver-node-gfjdd\" (UID: \"8173344d-6d10-4de8-8cc3-ce875eb9dc21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd" Apr 20 16:23:14.335513 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.334982 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.335513 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.335057 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c65cbd12-d018-46ff-8d22-86c1a6fb9204-host-slash\") pod \"iptables-alerter-f2nfg\" (UID: \"c65cbd12-d018-46ff-8d22-86c1a6fb9204\") " pod="openshift-network-operator/iptables-alerter-f2nfg" Apr 20 16:23:14.335513 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.335075 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.335513 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.335124 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8173344d-6d10-4de8-8cc3-ce875eb9dc21-sys-fs\") pod \"aws-ebs-csi-driver-node-gfjdd\" (UID: \"8173344d-6d10-4de8-8cc3-ce875eb9dc21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd" Apr 20 16:23:14.335513 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.335151 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs\") pod \"network-metrics-daemon-tr5xd\" (UID: \"7948c105-68aa-437a-a0ac-fa0d535c7b37\") " 
pod="openshift-multus/network-metrics-daemon-tr5xd" Apr 20 16:23:14.335513 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.335217 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8173344d-6d10-4de8-8cc3-ce875eb9dc21-sys-fs\") pod \"aws-ebs-csi-driver-node-gfjdd\" (UID: \"8173344d-6d10-4de8-8cc3-ce875eb9dc21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd" Apr 20 16:23:14.335513 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.335257 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b0527932-43f9-4c44-8a48-e6b0fc353de6-konnectivity-ca\") pod \"konnectivity-agent-7q74k\" (UID: \"b0527932-43f9-4c44-8a48-e6b0fc353de6\") " pod="kube-system/konnectivity-agent-7q74k" Apr 20 16:23:14.335513 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:14.335262 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:14.335513 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.335299 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/23487f52-5abf-4f26-b6e5-427ce8611cdb-multus-daemon-config\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.335513 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:14.335374 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs podName:7948c105-68aa-437a-a0ac-fa0d535c7b37 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:14.835350506 +0000 UTC m=+3.061558442 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs") pod "network-metrics-daemon-tr5xd" (UID: "7948c105-68aa-437a-a0ac-fa0d535c7b37") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:14.335513 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.335391 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-run-openvswitch\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.335513 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.335441 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-run-openvswitch\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.335513 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.335471 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3c78e1c2-fb6e-458b-8593-64d3e48a714e-ovn-node-metrics-cert\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.335513 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.335514 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/23487f52-5abf-4f26-b6e5-427ce8611cdb-cni-binary-copy\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.336189 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.335512 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1bb216ed-aa87-4017-b000-0f3d37d1fda9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wsgnn\" (UID: \"1bb216ed-aa87-4017-b000-0f3d37d1fda9\") " pod="openshift-multus/multus-additional-cni-plugins-wsgnn" Apr 20 16:23:14.336189 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.335573 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-multus-conf-dir\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.336189 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.335588 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-host-run-netns\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.336189 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.335604 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8j4hs\" (UniqueName: \"kubernetes.io/projected/7948c105-68aa-437a-a0ac-fa0d535c7b37-kube-api-access-8j4hs\") pod \"network-metrics-daemon-tr5xd\" (UID: \"7948c105-68aa-437a-a0ac-fa0d535c7b37\") " pod="openshift-multus/network-metrics-daemon-tr5xd" Apr 20 16:23:14.336189 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.335628 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b0527932-43f9-4c44-8a48-e6b0fc353de6-agent-certs\") pod \"konnectivity-agent-7q74k\" (UID: \"b0527932-43f9-4c44-8a48-e6b0fc353de6\") " pod="kube-system/konnectivity-agent-7q74k" Apr 20 16:23:14.336189 ip-10-0-142-44 
kubenswrapper[2576]: I0420 16:23:14.335645 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1bb216ed-aa87-4017-b000-0f3d37d1fda9-system-cni-dir\") pod \"multus-additional-cni-plugins-wsgnn\" (UID: \"1bb216ed-aa87-4017-b000-0f3d37d1fda9\") " pod="openshift-multus/multus-additional-cni-plugins-wsgnn" Apr 20 16:23:14.336189 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.335653 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-multus-conf-dir\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.336189 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.335664 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3c78e1c2-fb6e-458b-8593-64d3e48a714e-env-overrides\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.336189 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.335692 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-hostroot\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.336189 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.335840 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 16:23:14.336189 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.335859 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/23487f52-5abf-4f26-b6e5-427ce8611cdb-multus-daemon-config\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.336189 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.335924 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-hostroot\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.336189 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.335927 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-multus-socket-dir-parent\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.336189 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.335939 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-host-run-netns\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.336189 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.335960 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bc10ffb1-dd19-4a22-a3ed-7437a80f1ba7-host\") pod \"node-ca-wn2xt\" (UID: \"bc10ffb1-dd19-4a22-a3ed-7437a80f1ba7\") " 
pod="openshift-image-registry/node-ca-wn2xt" Apr 20 16:23:14.336189 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.335986 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-host-run-ovn-kubernetes\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.336189 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.335998 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-multus-socket-dir-parent\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.336189 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336014 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1bb216ed-aa87-4017-b000-0f3d37d1fda9-cnibin\") pod \"multus-additional-cni-plugins-wsgnn\" (UID: \"1bb216ed-aa87-4017-b000-0f3d37d1fda9\") " pod="openshift-multus/multus-additional-cni-plugins-wsgnn" Apr 20 16:23:14.337032 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336009 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bc10ffb1-dd19-4a22-a3ed-7437a80f1ba7-host\") pod \"node-ca-wn2xt\" (UID: \"bc10ffb1-dd19-4a22-a3ed-7437a80f1ba7\") " pod="openshift-image-registry/node-ca-wn2xt" Apr 20 16:23:14.337032 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336040 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-host-run-ovn-kubernetes\") pod \"ovnkube-node-s95ld\" (UID: 
\"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.337032 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336045 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-etc-sysctl-d\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls" Apr 20 16:23:14.337032 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336066 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3c78e1c2-fb6e-458b-8593-64d3e48a714e-env-overrides\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.337032 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336082 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-run\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls" Apr 20 16:23:14.337032 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336109 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-system-cni-dir\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.337032 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336125 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-multus-cni-dir\") pod \"multus-v7jxk\" (UID: 
\"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.337032 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336156 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-host-run-netns\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.337032 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336185 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-node-log\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.337032 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336201 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-system-cni-dir\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.337032 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336209 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3c78e1c2-fb6e-458b-8593-64d3e48a714e-ovnkube-config\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.337032 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336233 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-host-run-netns\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " 
pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.337032 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336245 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-multus-cni-dir\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.337032 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336254 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-node-log\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.337032 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336268 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3c78e1c2-fb6e-458b-8593-64d3e48a714e-ovnkube-script-lib\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.337032 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336297 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-etc-kubernetes\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls" Apr 20 16:23:14.337032 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336324 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqsgm\" (UniqueName: \"kubernetes.io/projected/8173344d-6d10-4de8-8cc3-ce875eb9dc21-kube-api-access-gqsgm\") pod \"aws-ebs-csi-driver-node-gfjdd\" (UID: \"8173344d-6d10-4de8-8cc3-ce875eb9dc21\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd" Apr 20 16:23:14.337032 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336348 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-cnibin\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.337910 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336372 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-host-cni-netd\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.337910 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336396 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9x89l\" (UniqueName: \"kubernetes.io/projected/3c78e1c2-fb6e-458b-8593-64d3e48a714e-kube-api-access-9x89l\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.337910 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336421 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1bb216ed-aa87-4017-b000-0f3d37d1fda9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wsgnn\" (UID: \"1bb216ed-aa87-4017-b000-0f3d37d1fda9\") " pod="openshift-multus/multus-additional-cni-plugins-wsgnn" Apr 20 16:23:14.337910 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336448 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-host-run-multus-certs\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.337910 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336474 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j87c8\" (UniqueName: \"kubernetes.io/projected/23487f52-5abf-4f26-b6e5-427ce8611cdb-kube-api-access-j87c8\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.337910 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336496 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bc10ffb1-dd19-4a22-a3ed-7437a80f1ba7-serviceca\") pod \"node-ca-wn2xt\" (UID: \"bc10ffb1-dd19-4a22-a3ed-7437a80f1ba7\") " pod="openshift-image-registry/node-ca-wn2xt" Apr 20 16:23:14.337910 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336520 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-host\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls" Apr 20 16:23:14.337910 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336562 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-cnibin\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.337910 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336552 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8173344d-6d10-4de8-8cc3-ce875eb9dc21-socket-dir\") 
pod \"aws-ebs-csi-driver-node-gfjdd\" (UID: \"8173344d-6d10-4de8-8cc3-ce875eb9dc21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd" Apr 20 16:23:14.337910 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336626 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8173344d-6d10-4de8-8cc3-ce875eb9dc21-device-dir\") pod \"aws-ebs-csi-driver-node-gfjdd\" (UID: \"8173344d-6d10-4de8-8cc3-ce875eb9dc21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd" Apr 20 16:23:14.337910 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336652 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-host-var-lib-cni-bin\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.337910 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336689 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-etc-modprobe-d\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls" Apr 20 16:23:14.337910 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336717 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9df8f577-8118-408c-a41e-c4568ea0c8ef-etc-tuned\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls" Apr 20 16:23:14.337910 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336746 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-etc-kubernetes\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.337910 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336789 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ln67g\" (UniqueName: \"kubernetes.io/projected/3b741c5a-ce22-4075-86e3-0c1155e94215-kube-api-access-ln67g\") pod \"network-check-target-cq4h4\" (UID: \"3b741c5a-ce22-4075-86e3-0c1155e94215\") " pod="openshift-network-diagnostics/network-check-target-cq4h4" Apr 20 16:23:14.337910 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336797 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3c78e1c2-fb6e-458b-8593-64d3e48a714e-ovnkube-script-lib\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.337910 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336816 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-host-kubelet\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.338685 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336801 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8173344d-6d10-4de8-8cc3-ce875eb9dc21-socket-dir\") pod \"aws-ebs-csi-driver-node-gfjdd\" (UID: \"8173344d-6d10-4de8-8cc3-ce875eb9dc21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd" Apr 20 16:23:14.338685 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336690 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3c78e1c2-fb6e-458b-8593-64d3e48a714e-ovnkube-config\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.338685 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336847 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1bb216ed-aa87-4017-b000-0f3d37d1fda9-cni-binary-copy\") pod \"multus-additional-cni-plugins-wsgnn\" (UID: \"1bb216ed-aa87-4017-b000-0f3d37d1fda9\") " pod="openshift-multus/multus-additional-cni-plugins-wsgnn" Apr 20 16:23:14.338685 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336887 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-etc-kubernetes\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.338685 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336891 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-etc-sysctl-conf\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls" Apr 20 16:23:14.338685 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336920 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9df8f577-8118-408c-a41e-c4568ea0c8ef-tmp\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls" Apr 20 16:23:14.338685 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336947 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-host-slash\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.338685 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336955 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-host-run-multus-certs\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.338685 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336977 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1bb216ed-aa87-4017-b000-0f3d37d1fda9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wsgnn\" (UID: \"1bb216ed-aa87-4017-b000-0f3d37d1fda9\") " pod="openshift-multus/multus-additional-cni-plugins-wsgnn" Apr 20 16:23:14.338685 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.336601 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-host-cni-netd\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.338685 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337011 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-host-var-lib-cni-multus\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.338685 ip-10-0-142-44 kubenswrapper[2576]: I0420 
16:23:14.337037 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8173344d-6d10-4de8-8cc3-ce875eb9dc21-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gfjdd\" (UID: \"8173344d-6d10-4de8-8cc3-ce875eb9dc21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd" Apr 20 16:23:14.338685 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337043 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-host-slash\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.338685 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337064 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-host-run-k8s-cni-cncf-io\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.338685 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337090 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-var-lib-openvswitch\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.338685 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337092 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-host-var-lib-cni-bin\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.338685 ip-10-0-142-44 
kubenswrapper[2576]: I0420 16:23:14.337104 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-host-kubelet\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.339459 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337113 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-etc-openvswitch\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.339459 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337140 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8173344d-6d10-4de8-8cc3-ce875eb9dc21-device-dir\") pod \"aws-ebs-csi-driver-node-gfjdd\" (UID: \"8173344d-6d10-4de8-8cc3-ce875eb9dc21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd" Apr 20 16:23:14.339459 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337157 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c65cbd12-d018-46ff-8d22-86c1a6fb9204-iptables-alerter-script\") pod \"iptables-alerter-f2nfg\" (UID: \"c65cbd12-d018-46ff-8d22-86c1a6fb9204\") " pod="openshift-network-operator/iptables-alerter-f2nfg" Apr 20 16:23:14.339459 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337165 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-host-var-lib-cni-multus\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " 
pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.339459 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337171 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8173344d-6d10-4de8-8cc3-ce875eb9dc21-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gfjdd\" (UID: \"8173344d-6d10-4de8-8cc3-ce875eb9dc21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd" Apr 20 16:23:14.339459 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337190 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bc10ffb1-dd19-4a22-a3ed-7437a80f1ba7-serviceca\") pod \"node-ca-wn2xt\" (UID: \"bc10ffb1-dd19-4a22-a3ed-7437a80f1ba7\") " pod="openshift-image-registry/node-ca-wn2xt" Apr 20 16:23:14.339459 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337196 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scdrd\" (UniqueName: \"kubernetes.io/projected/1bb216ed-aa87-4017-b000-0f3d37d1fda9-kube-api-access-scdrd\") pod \"multus-additional-cni-plugins-wsgnn\" (UID: \"1bb216ed-aa87-4017-b000-0f3d37d1fda9\") " pod="openshift-multus/multus-additional-cni-plugins-wsgnn" Apr 20 16:23:14.339459 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337214 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/23487f52-5abf-4f26-b6e5-427ce8611cdb-host-run-k8s-cni-cncf-io\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.339459 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337233 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-etc-sysconfig\") pod \"tuned-zblls\" (UID: 
\"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls" Apr 20 16:23:14.339459 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337255 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-var-lib-openvswitch\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.339459 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337273 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-etc-openvswitch\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.339459 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337291 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-log-socket\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.339459 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337322 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-host-cni-bin\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.339459 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337365 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-log-socket\") pod \"ovnkube-node-s95ld\" (UID: 
\"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.339459 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337358 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wckpq\" (UniqueName: \"kubernetes.io/projected/c65cbd12-d018-46ff-8d22-86c1a6fb9204-kube-api-access-wckpq\") pod \"iptables-alerter-f2nfg\" (UID: \"c65cbd12-d018-46ff-8d22-86c1a6fb9204\") " pod="openshift-network-operator/iptables-alerter-f2nfg" Apr 20 16:23:14.339459 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337396 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-host-cni-bin\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.339459 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337410 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-etc-systemd\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls" Apr 20 16:23:14.340227 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337446 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-sys\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls" Apr 20 16:23:14.340227 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337476 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-lib-modules\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls" Apr 20 16:23:14.340227 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337499 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-var-lib-kubelet\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls" Apr 20 16:23:14.340227 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337529 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjrsr\" (UniqueName: \"kubernetes.io/projected/0f424042-eb12-467e-85c1-cbdd302c3e4d-kube-api-access-zjrsr\") pod \"node-resolver-tnl7m\" (UID: \"0f424042-eb12-467e-85c1-cbdd302c3e4d\") " pod="openshift-dns/node-resolver-tnl7m" Apr 20 16:23:14.340227 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337556 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjm2v\" (UniqueName: \"kubernetes.io/projected/9df8f577-8118-408c-a41e-c4568ea0c8ef-kube-api-access-jjm2v\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls" Apr 20 16:23:14.340227 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337581 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-run-systemd\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.340227 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337608 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-run-ovn\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.340227 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337617 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-run-systemd\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.340227 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337634 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0f424042-eb12-467e-85c1-cbdd302c3e4d-hosts-file\") pod \"node-resolver-tnl7m\" (UID: \"0f424042-eb12-467e-85c1-cbdd302c3e4d\") " pod="openshift-dns/node-resolver-tnl7m" Apr 20 16:23:14.340227 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337659 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0f424042-eb12-467e-85c1-cbdd302c3e4d-tmp-dir\") pod \"node-resolver-tnl7m\" (UID: \"0f424042-eb12-467e-85c1-cbdd302c3e4d\") " pod="openshift-dns/node-resolver-tnl7m" Apr 20 16:23:14.340227 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337678 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-run-ovn\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.340227 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337684 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vk5v7\" 
(UniqueName: \"kubernetes.io/projected/bc10ffb1-dd19-4a22-a3ed-7437a80f1ba7-kube-api-access-vk5v7\") pod \"node-ca-wn2xt\" (UID: \"bc10ffb1-dd19-4a22-a3ed-7437a80f1ba7\") " pod="openshift-image-registry/node-ca-wn2xt" Apr 20 16:23:14.340227 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337709 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-systemd-units\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.340227 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337735 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1bb216ed-aa87-4017-b000-0f3d37d1fda9-os-release\") pod \"multus-additional-cni-plugins-wsgnn\" (UID: \"1bb216ed-aa87-4017-b000-0f3d37d1fda9\") " pod="openshift-multus/multus-additional-cni-plugins-wsgnn" Apr 20 16:23:14.340227 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337768 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0f424042-eb12-467e-85c1-cbdd302c3e4d-hosts-file\") pod \"node-resolver-tnl7m\" (UID: \"0f424042-eb12-467e-85c1-cbdd302c3e4d\") " pod="openshift-dns/node-resolver-tnl7m" Apr 20 16:23:14.340227 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.337831 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3c78e1c2-fb6e-458b-8593-64d3e48a714e-systemd-units\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.340227 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.338087 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" 
(UniqueName: \"kubernetes.io/empty-dir/0f424042-eb12-467e-85c1-cbdd302c3e4d-tmp-dir\") pod \"node-resolver-tnl7m\" (UID: \"0f424042-eb12-467e-85c1-cbdd302c3e4d\") " pod="openshift-dns/node-resolver-tnl7m" Apr 20 16:23:14.341056 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.340782 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3c78e1c2-fb6e-458b-8593-64d3e48a714e-ovn-node-metrics-cert\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:14.343452 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.343375 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j4hs\" (UniqueName: \"kubernetes.io/projected/7948c105-68aa-437a-a0ac-fa0d535c7b37-kube-api-access-8j4hs\") pod \"network-metrics-daemon-tr5xd\" (UID: \"7948c105-68aa-437a-a0ac-fa0d535c7b37\") " pod="openshift-multus/network-metrics-daemon-tr5xd" Apr 20 16:23:14.344102 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:14.344077 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 16:23:14.344194 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:14.344105 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 16:23:14.344194 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:14.344120 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ln67g for pod openshift-network-diagnostics/network-check-target-cq4h4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:14.344194 ip-10-0-142-44 kubenswrapper[2576]: E0420 
16:23:14.344177 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b741c5a-ce22-4075-86e3-0c1155e94215-kube-api-access-ln67g podName:3b741c5a-ce22-4075-86e3-0c1155e94215 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:14.844158614 +0000 UTC m=+3.070366540 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ln67g" (UniqueName: "kubernetes.io/projected/3b741c5a-ce22-4075-86e3-0c1155e94215-kube-api-access-ln67g") pod "network-check-target-cq4h4" (UID: "3b741c5a-ce22-4075-86e3-0c1155e94215") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:14.345508 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.345488 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j87c8\" (UniqueName: \"kubernetes.io/projected/23487f52-5abf-4f26-b6e5-427ce8611cdb-kube-api-access-j87c8\") pod \"multus-v7jxk\" (UID: \"23487f52-5abf-4f26-b6e5-427ce8611cdb\") " pod="openshift-multus/multus-v7jxk" Apr 20 16:23:14.345844 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.345823 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjrsr\" (UniqueName: \"kubernetes.io/projected/0f424042-eb12-467e-85c1-cbdd302c3e4d-kube-api-access-zjrsr\") pod \"node-resolver-tnl7m\" (UID: \"0f424042-eb12-467e-85c1-cbdd302c3e4d\") " pod="openshift-dns/node-resolver-tnl7m" Apr 20 16:23:14.345929 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.345873 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk5v7\" (UniqueName: \"kubernetes.io/projected/bc10ffb1-dd19-4a22-a3ed-7437a80f1ba7-kube-api-access-vk5v7\") pod \"node-ca-wn2xt\" (UID: \"bc10ffb1-dd19-4a22-a3ed-7437a80f1ba7\") " pod="openshift-image-registry/node-ca-wn2xt" Apr 20 16:23:14.346209 ip-10-0-142-44 kubenswrapper[2576]: I0420 
16:23:14.346187 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x89l\" (UniqueName: \"kubernetes.io/projected/3c78e1c2-fb6e-458b-8593-64d3e48a714e-kube-api-access-9x89l\") pod \"ovnkube-node-s95ld\" (UID: \"3c78e1c2-fb6e-458b-8593-64d3e48a714e\") " pod="openshift-ovn-kubernetes/ovnkube-node-s95ld"
Apr 20 16:23:14.346463 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.346441 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqsgm\" (UniqueName: \"kubernetes.io/projected/8173344d-6d10-4de8-8cc3-ce875eb9dc21-kube-api-access-gqsgm\") pod \"aws-ebs-csi-driver-node-gfjdd\" (UID: \"8173344d-6d10-4de8-8cc3-ce875eb9dc21\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd"
Apr 20 16:23:14.361460 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.361406 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-44.ec2.internal" event={"ID":"baf1f649d76438d5a7a5af68df1c18c0","Type":"ContainerStarted","Data":"99f95aabdf92d11a1f075675952d222670e934d3e347c274bfe5e41a7f900dc6"}
Apr 20 16:23:14.362305 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.362284 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-44.ec2.internal" event={"ID":"1fe42efe7b737ea774d310634568d2b9","Type":"ContainerStarted","Data":"9fc78dbc143bbe9a5907afe345e81612cc232c4c5870933cb47a726f06aeb599"}
Apr 20 16:23:14.438269 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438230 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c65cbd12-d018-46ff-8d22-86c1a6fb9204-iptables-alerter-script\") pod \"iptables-alerter-f2nfg\" (UID: \"c65cbd12-d018-46ff-8d22-86c1a6fb9204\") " pod="openshift-network-operator/iptables-alerter-f2nfg"
Apr 20 16:23:14.438269 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438273 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-scdrd\" (UniqueName: \"kubernetes.io/projected/1bb216ed-aa87-4017-b000-0f3d37d1fda9-kube-api-access-scdrd\") pod \"multus-additional-cni-plugins-wsgnn\" (UID: \"1bb216ed-aa87-4017-b000-0f3d37d1fda9\") " pod="openshift-multus/multus-additional-cni-plugins-wsgnn"
Apr 20 16:23:14.438487 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438296 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-etc-sysconfig\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.438487 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438318 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wckpq\" (UniqueName: \"kubernetes.io/projected/c65cbd12-d018-46ff-8d22-86c1a6fb9204-kube-api-access-wckpq\") pod \"iptables-alerter-f2nfg\" (UID: \"c65cbd12-d018-46ff-8d22-86c1a6fb9204\") " pod="openshift-network-operator/iptables-alerter-f2nfg"
Apr 20 16:23:14.438487 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438343 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-etc-systemd\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.438487 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438378 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-etc-sysconfig\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.438487 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438369 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-sys\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.438487 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438411 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-lib-modules\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.438487 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438435 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-var-lib-kubelet\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.438487 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438441 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-etc-systemd\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.438487 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438472 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjm2v\" (UniqueName: \"kubernetes.io/projected/9df8f577-8118-408c-a41e-c4568ea0c8ef-kube-api-access-jjm2v\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.438487 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438486 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-var-lib-kubelet\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.438951 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438503 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1bb216ed-aa87-4017-b000-0f3d37d1fda9-os-release\") pod \"multus-additional-cni-plugins-wsgnn\" (UID: \"1bb216ed-aa87-4017-b000-0f3d37d1fda9\") " pod="openshift-multus/multus-additional-cni-plugins-wsgnn"
Apr 20 16:23:14.438951 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438502 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-sys\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.438951 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438532 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c65cbd12-d018-46ff-8d22-86c1a6fb9204-host-slash\") pod \"iptables-alerter-f2nfg\" (UID: \"c65cbd12-d018-46ff-8d22-86c1a6fb9204\") " pod="openshift-network-operator/iptables-alerter-f2nfg"
Apr 20 16:23:14.438951 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438572 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b0527932-43f9-4c44-8a48-e6b0fc353de6-konnectivity-ca\") pod \"konnectivity-agent-7q74k\" (UID: \"b0527932-43f9-4c44-8a48-e6b0fc353de6\") " pod="kube-system/konnectivity-agent-7q74k"
Apr 20 16:23:14.438951 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438582 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-lib-modules\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.438951 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438596 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1bb216ed-aa87-4017-b000-0f3d37d1fda9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wsgnn\" (UID: \"1bb216ed-aa87-4017-b000-0f3d37d1fda9\") " pod="openshift-multus/multus-additional-cni-plugins-wsgnn"
Apr 20 16:23:14.438951 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438616 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c65cbd12-d018-46ff-8d22-86c1a6fb9204-host-slash\") pod \"iptables-alerter-f2nfg\" (UID: \"c65cbd12-d018-46ff-8d22-86c1a6fb9204\") " pod="openshift-network-operator/iptables-alerter-f2nfg"
Apr 20 16:23:14.438951 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438583 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1bb216ed-aa87-4017-b000-0f3d37d1fda9-os-release\") pod \"multus-additional-cni-plugins-wsgnn\" (UID: \"1bb216ed-aa87-4017-b000-0f3d37d1fda9\") " pod="openshift-multus/multus-additional-cni-plugins-wsgnn"
Apr 20 16:23:14.438951 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438634 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b0527932-43f9-4c44-8a48-e6b0fc353de6-agent-certs\") pod \"konnectivity-agent-7q74k\" (UID: \"b0527932-43f9-4c44-8a48-e6b0fc353de6\") " pod="kube-system/konnectivity-agent-7q74k"
Apr 20 16:23:14.438951 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438662 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1bb216ed-aa87-4017-b000-0f3d37d1fda9-system-cni-dir\") pod \"multus-additional-cni-plugins-wsgnn\" (UID: \"1bb216ed-aa87-4017-b000-0f3d37d1fda9\") " pod="openshift-multus/multus-additional-cni-plugins-wsgnn"
Apr 20 16:23:14.438951 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438699 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1bb216ed-aa87-4017-b000-0f3d37d1fda9-cnibin\") pod \"multus-additional-cni-plugins-wsgnn\" (UID: \"1bb216ed-aa87-4017-b000-0f3d37d1fda9\") " pod="openshift-multus/multus-additional-cni-plugins-wsgnn"
Apr 20 16:23:14.438951 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438729 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1bb216ed-aa87-4017-b000-0f3d37d1fda9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wsgnn\" (UID: \"1bb216ed-aa87-4017-b000-0f3d37d1fda9\") " pod="openshift-multus/multus-additional-cni-plugins-wsgnn"
Apr 20 16:23:14.438951 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438738 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1bb216ed-aa87-4017-b000-0f3d37d1fda9-system-cni-dir\") pod \"multus-additional-cni-plugins-wsgnn\" (UID: \"1bb216ed-aa87-4017-b000-0f3d37d1fda9\") " pod="openshift-multus/multus-additional-cni-plugins-wsgnn"
Apr 20 16:23:14.438951 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438731 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-etc-sysctl-d\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.438951 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438789 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1bb216ed-aa87-4017-b000-0f3d37d1fda9-cnibin\") pod \"multus-additional-cni-plugins-wsgnn\" (UID: \"1bb216ed-aa87-4017-b000-0f3d37d1fda9\") " pod="openshift-multus/multus-additional-cni-plugins-wsgnn"
Apr 20 16:23:14.438951 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438790 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-run\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.438951 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438814 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-run\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.439644 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438828 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c65cbd12-d018-46ff-8d22-86c1a6fb9204-iptables-alerter-script\") pod \"iptables-alerter-f2nfg\" (UID: \"c65cbd12-d018-46ff-8d22-86c1a6fb9204\") " pod="openshift-network-operator/iptables-alerter-f2nfg"
Apr 20 16:23:14.439644 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438838 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-etc-kubernetes\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.439644 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438871 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1bb216ed-aa87-4017-b000-0f3d37d1fda9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wsgnn\" (UID: \"1bb216ed-aa87-4017-b000-0f3d37d1fda9\") " pod="openshift-multus/multus-additional-cni-plugins-wsgnn"
Apr 20 16:23:14.439644 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438891 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-etc-sysctl-d\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.439644 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438899 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-host\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.439644 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438925 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-etc-modprobe-d\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.439644 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438931 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-etc-kubernetes\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.439644 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438958 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9df8f577-8118-408c-a41e-c4568ea0c8ef-etc-tuned\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.439644 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438967 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-host\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.439644 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.438996 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1bb216ed-aa87-4017-b000-0f3d37d1fda9-cni-binary-copy\") pod \"multus-additional-cni-plugins-wsgnn\" (UID: \"1bb216ed-aa87-4017-b000-0f3d37d1fda9\") " pod="openshift-multus/multus-additional-cni-plugins-wsgnn"
Apr 20 16:23:14.439644 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.439016 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-etc-modprobe-d\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.439644 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.439021 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-etc-sysctl-conf\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.439644 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.439046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9df8f577-8118-408c-a41e-c4568ea0c8ef-tmp\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.439644 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.439160 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b0527932-43f9-4c44-8a48-e6b0fc353de6-konnectivity-ca\") pod \"konnectivity-agent-7q74k\" (UID: \"b0527932-43f9-4c44-8a48-e6b0fc353de6\") " pod="kube-system/konnectivity-agent-7q74k"
Apr 20 16:23:14.439644 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.439167 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9df8f577-8118-408c-a41e-c4568ea0c8ef-etc-sysctl-conf\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.439644 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.439212 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1bb216ed-aa87-4017-b000-0f3d37d1fda9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wsgnn\" (UID: \"1bb216ed-aa87-4017-b000-0f3d37d1fda9\") " pod="openshift-multus/multus-additional-cni-plugins-wsgnn"
Apr 20 16:23:14.439644 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.439411 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1bb216ed-aa87-4017-b000-0f3d37d1fda9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wsgnn\" (UID: \"1bb216ed-aa87-4017-b000-0f3d37d1fda9\") " pod="openshift-multus/multus-additional-cni-plugins-wsgnn"
Apr 20 16:23:14.440504 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.439844 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1bb216ed-aa87-4017-b000-0f3d37d1fda9-cni-binary-copy\") pod \"multus-additional-cni-plugins-wsgnn\" (UID: \"1bb216ed-aa87-4017-b000-0f3d37d1fda9\") " pod="openshift-multus/multus-additional-cni-plugins-wsgnn"
Apr 20 16:23:14.440635 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.440612 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1bb216ed-aa87-4017-b000-0f3d37d1fda9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wsgnn\" (UID: \"1bb216ed-aa87-4017-b000-0f3d37d1fda9\") " pod="openshift-multus/multus-additional-cni-plugins-wsgnn"
Apr 20 16:23:14.441886 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.441862 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9df8f577-8118-408c-a41e-c4568ea0c8ef-tmp\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.441886 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.441880 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b0527932-43f9-4c44-8a48-e6b0fc353de6-agent-certs\") pod \"konnectivity-agent-7q74k\" (UID: \"b0527932-43f9-4c44-8a48-e6b0fc353de6\") " pod="kube-system/konnectivity-agent-7q74k"
Apr 20 16:23:14.442049 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.441896 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9df8f577-8118-408c-a41e-c4568ea0c8ef-etc-tuned\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.446512 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.446460 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wckpq\" (UniqueName: \"kubernetes.io/projected/c65cbd12-d018-46ff-8d22-86c1a6fb9204-kube-api-access-wckpq\") pod \"iptables-alerter-f2nfg\" (UID: \"c65cbd12-d018-46ff-8d22-86c1a6fb9204\") " pod="openshift-network-operator/iptables-alerter-f2nfg"
Apr 20 16:23:14.446696 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.446675 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-scdrd\" (UniqueName: \"kubernetes.io/projected/1bb216ed-aa87-4017-b000-0f3d37d1fda9-kube-api-access-scdrd\") pod \"multus-additional-cni-plugins-wsgnn\" (UID: \"1bb216ed-aa87-4017-b000-0f3d37d1fda9\") " pod="openshift-multus/multus-additional-cni-plugins-wsgnn"
Apr 20 16:23:14.447358 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.447334 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjm2v\" (UniqueName: \"kubernetes.io/projected/9df8f577-8118-408c-a41e-c4568ea0c8ef-kube-api-access-jjm2v\") pod \"tuned-zblls\" (UID: \"9df8f577-8118-408c-a41e-c4568ea0c8ef\") " pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.539737 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.539650 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wn2xt"
Apr 20 16:23:14.548386 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.548359 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tnl7m"
Apr 20 16:23:14.557111 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.557082 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s95ld"
Apr 20 16:23:14.561930 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.561899 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-v7jxk"
Apr 20 16:23:14.569604 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.569580 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd"
Apr 20 16:23:14.577265 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.577236 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-f2nfg"
Apr 20 16:23:14.585945 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.585912 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7q74k"
Apr 20 16:23:14.594580 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.594552 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wsgnn"
Apr 20 16:23:14.601334 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.601308 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zblls"
Apr 20 16:23:14.708773 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.708733 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 16:23:14.841678 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.841603 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs\") pod \"network-metrics-daemon-tr5xd\" (UID: \"7948c105-68aa-437a-a0ac-fa0d535c7b37\") " pod="openshift-multus/network-metrics-daemon-tr5xd"
Apr 20 16:23:14.841832 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:14.841710 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 16:23:14.841832 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:14.841798 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs podName:7948c105-68aa-437a-a0ac-fa0d535c7b37 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:15.841767478 +0000 UTC m=+4.067975411 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs") pod "network-metrics-daemon-tr5xd" (UID: "7948c105-68aa-437a-a0ac-fa0d535c7b37") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 16:23:14.942749 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:14.942715 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ln67g\" (UniqueName: \"kubernetes.io/projected/3b741c5a-ce22-4075-86e3-0c1155e94215-kube-api-access-ln67g\") pod \"network-check-target-cq4h4\" (UID: \"3b741c5a-ce22-4075-86e3-0c1155e94215\") " pod="openshift-network-diagnostics/network-check-target-cq4h4"
Apr 20 16:23:14.942912 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:14.942857 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 16:23:14.942912 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:14.942874 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 16:23:14.942912 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:14.942885 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ln67g for pod openshift-network-diagnostics/network-check-target-cq4h4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 16:23:14.943041 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:14.942941 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b741c5a-ce22-4075-86e3-0c1155e94215-kube-api-access-ln67g podName:3b741c5a-ce22-4075-86e3-0c1155e94215 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:15.942922758 +0000 UTC m=+4.169130699 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-ln67g" (UniqueName: "kubernetes.io/projected/3b741c5a-ce22-4075-86e3-0c1155e94215-kube-api-access-ln67g") pod "network-check-target-cq4h4" (UID: "3b741c5a-ce22-4075-86e3-0c1155e94215") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 16:23:15.173469 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:15.173440 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bb216ed_aa87_4017_b000_0f3d37d1fda9.slice/crio-72a372baad50d71c4f92791da25a97c53244fe7024235db508920048132a6cd8 WatchSource:0}: Error finding container 72a372baad50d71c4f92791da25a97c53244fe7024235db508920048132a6cd8: Status 404 returned error can't find the container with id 72a372baad50d71c4f92791da25a97c53244fe7024235db508920048132a6cd8
Apr 20 16:23:15.188831 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:15.188718 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0527932_43f9_4c44_8a48_e6b0fc353de6.slice/crio-0cedb90de00358caf233efc66d5e04aac34ef22c00b087f467ce5fe3ceff9d4f WatchSource:0}: Error finding container 0cedb90de00358caf233efc66d5e04aac34ef22c00b087f467ce5fe3ceff9d4f: Status 404 returned error can't find the container with id 0cedb90de00358caf233efc66d5e04aac34ef22c00b087f467ce5fe3ceff9d4f
Apr 20 16:23:15.190974 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:15.190925 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23487f52_5abf_4f26_b6e5_427ce8611cdb.slice/crio-27ee99d53d66fe2bb6c1cd3dc13d70f0ba96500259a3b74097cf7b4717afa80a WatchSource:0}: Error finding container 27ee99d53d66fe2bb6c1cd3dc13d70f0ba96500259a3b74097cf7b4717afa80a: Status 404 returned error can't find the container with id 27ee99d53d66fe2bb6c1cd3dc13d70f0ba96500259a3b74097cf7b4717afa80a
Apr 20 16:23:15.191858 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:15.191833 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c78e1c2_fb6e_458b_8593_64d3e48a714e.slice/crio-882bc0d7d5ff4f15f09bddcf905336e721869ed32d7f4e7cfbf2afeadac92a0c WatchSource:0}: Error finding container 882bc0d7d5ff4f15f09bddcf905336e721869ed32d7f4e7cfbf2afeadac92a0c: Status 404 returned error can't find the container with id 882bc0d7d5ff4f15f09bddcf905336e721869ed32d7f4e7cfbf2afeadac92a0c
Apr 20 16:23:15.192738 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:15.192718 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc10ffb1_dd19_4a22_a3ed_7437a80f1ba7.slice/crio-5481baaf06969ca17661a94dea215dcd55981bee3aba66d3bbfde0f30b01e18b WatchSource:0}: Error finding container 5481baaf06969ca17661a94dea215dcd55981bee3aba66d3bbfde0f30b01e18b: Status 404 returned error can't find the container with id 5481baaf06969ca17661a94dea215dcd55981bee3aba66d3bbfde0f30b01e18b
Apr 20 16:23:15.193499 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:15.193411 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8173344d_6d10_4de8_8cc3_ce875eb9dc21.slice/crio-9e054b32e58b8acbda745873a7d5a16f5d858192b558ec3251174d0a770bf381 WatchSource:0}: Error finding container 9e054b32e58b8acbda745873a7d5a16f5d858192b558ec3251174d0a770bf381: Status 404 returned error can't find the container with id 9e054b32e58b8acbda745873a7d5a16f5d858192b558ec3251174d0a770bf381
Apr 20 16:23:15.194298 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:15.194180 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f424042_eb12_467e_85c1_cbdd302c3e4d.slice/crio-9cd198f087007882278720a94e60820cd3610898b1036a8bbe33cbfddc8a31b9 WatchSource:0}: Error finding container 9cd198f087007882278720a94e60820cd3610898b1036a8bbe33cbfddc8a31b9: Status 404 returned error can't find the container with id 9cd198f087007882278720a94e60820cd3610898b1036a8bbe33cbfddc8a31b9
Apr 20 16:23:15.195473 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:15.195452 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9df8f577_8118_408c_a41e_c4568ea0c8ef.slice/crio-7a887e80afd19a145cf2dd843bc5310d9591d61753f31d1e5cb7279dc6656cb4 WatchSource:0}: Error finding container 7a887e80afd19a145cf2dd843bc5310d9591d61753f31d1e5cb7279dc6656cb4: Status 404 returned error can't find the container with id 7a887e80afd19a145cf2dd843bc5310d9591d61753f31d1e5cb7279dc6656cb4
Apr 20 16:23:15.197293 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:15.197271 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc65cbd12_d018_46ff_8d22_86c1a6fb9204.slice/crio-709eb89a5a00062b4eeb15869cfdf683ee0ade887d05f97f42363a350d78728d WatchSource:0}: Error finding container 709eb89a5a00062b4eeb15869cfdf683ee0ade887d05f97f42363a350d78728d: Status 404 returned error can't find the container with id 709eb89a5a00062b4eeb15869cfdf683ee0ade887d05f97f42363a350d78728d
Apr 20 16:23:15.258632 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:15.258601 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 16:18:13 +0000 UTC" deadline="2027-12-25 16:47:51.737205758 +0000 UTC"
Apr 20 16:23:15.258632 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:15.258626 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14736h24m36.478581881s"
Apr 20 16:23:15.364945 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:15.364900 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-v7jxk" event={"ID":"23487f52-5abf-4f26-b6e5-427ce8611cdb","Type":"ContainerStarted","Data":"27ee99d53d66fe2bb6c1cd3dc13d70f0ba96500259a3b74097cf7b4717afa80a"}
Apr 20 16:23:15.365897 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:15.365872 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7q74k" event={"ID":"b0527932-43f9-4c44-8a48-e6b0fc353de6","Type":"ContainerStarted","Data":"0cedb90de00358caf233efc66d5e04aac34ef22c00b087f467ce5fe3ceff9d4f"}
Apr 20 16:23:15.366710 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:15.366686 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wsgnn" event={"ID":"1bb216ed-aa87-4017-b000-0f3d37d1fda9","Type":"ContainerStarted","Data":"72a372baad50d71c4f92791da25a97c53244fe7024235db508920048132a6cd8"}
Apr 20 16:23:15.368028 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:15.368001 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-44.ec2.internal" event={"ID":"1fe42efe7b737ea774d310634568d2b9","Type":"ContainerStarted","Data":"2ab9fff981ca8f16437dd360ed02d2515eba318b05fdee116c26ceb441c31658"}
Apr 20 16:23:15.368960 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:15.368941 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zblls" event={"ID":"9df8f577-8118-408c-a41e-c4568ea0c8ef","Type":"ContainerStarted","Data":"7a887e80afd19a145cf2dd843bc5310d9591d61753f31d1e5cb7279dc6656cb4"}
Apr 20 16:23:15.369839 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:15.369819 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd" event={"ID":"8173344d-6d10-4de8-8cc3-ce875eb9dc21","Type":"ContainerStarted","Data":"9e054b32e58b8acbda745873a7d5a16f5d858192b558ec3251174d0a770bf381"}
Apr 20 16:23:15.370718 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:15.370688 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wn2xt" event={"ID":"bc10ffb1-dd19-4a22-a3ed-7437a80f1ba7","Type":"ContainerStarted","Data":"5481baaf06969ca17661a94dea215dcd55981bee3aba66d3bbfde0f30b01e18b"}
Apr 20 16:23:15.371694 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:15.371675 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-f2nfg" event={"ID":"c65cbd12-d018-46ff-8d22-86c1a6fb9204","Type":"ContainerStarted","Data":"709eb89a5a00062b4eeb15869cfdf683ee0ade887d05f97f42363a350d78728d"}
Apr 20 16:23:15.372572 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:15.372555 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tnl7m" event={"ID":"0f424042-eb12-467e-85c1-cbdd302c3e4d","Type":"ContainerStarted","Data":"9cd198f087007882278720a94e60820cd3610898b1036a8bbe33cbfddc8a31b9"}
Apr 20 16:23:15.373433 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:15.373415 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" event={"ID":"3c78e1c2-fb6e-458b-8593-64d3e48a714e","Type":"ContainerStarted","Data":"882bc0d7d5ff4f15f09bddcf905336e721869ed32d7f4e7cfbf2afeadac92a0c"}
Apr 20 16:23:15.380654 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:15.380611 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-44.ec2.internal" podStartSLOduration=2.380598335 podStartE2EDuration="2.380598335s" podCreationTimestamp="2026-04-20 16:23:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 16:23:15.380244752 +0000 UTC m=+3.606452695" watchObservedRunningTime="2026-04-20 16:23:15.380598335 +0000 UTC m=+3.606806280"
Apr 20 16:23:15.849510 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:15.849471 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs\") pod \"network-metrics-daemon-tr5xd\" (UID: \"7948c105-68aa-437a-a0ac-fa0d535c7b37\") " pod="openshift-multus/network-metrics-daemon-tr5xd"
Apr 20 16:23:15.849684 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:15.849654 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 16:23:15.849739 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:15.849718 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs podName:7948c105-68aa-437a-a0ac-fa0d535c7b37 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:17.849698319 +0000 UTC m=+6.075906262 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs") pod "network-metrics-daemon-tr5xd" (UID: "7948c105-68aa-437a-a0ac-fa0d535c7b37") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:15.950844 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:15.950812 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ln67g\" (UniqueName: \"kubernetes.io/projected/3b741c5a-ce22-4075-86e3-0c1155e94215-kube-api-access-ln67g\") pod \"network-check-target-cq4h4\" (UID: \"3b741c5a-ce22-4075-86e3-0c1155e94215\") " pod="openshift-network-diagnostics/network-check-target-cq4h4" Apr 20 16:23:15.950975 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:15.950965 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 16:23:15.951030 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:15.950984 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 16:23:15.951030 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:15.950997 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ln67g for pod openshift-network-diagnostics/network-check-target-cq4h4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:15.951140 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:15.951056 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b741c5a-ce22-4075-86e3-0c1155e94215-kube-api-access-ln67g podName:3b741c5a-ce22-4075-86e3-0c1155e94215 nodeName:}" failed. 
No retries permitted until 2026-04-20 16:23:17.951037366 +0000 UTC m=+6.177245287 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ln67g" (UniqueName: "kubernetes.io/projected/3b741c5a-ce22-4075-86e3-0c1155e94215-kube-api-access-ln67g") pod "network-check-target-cq4h4" (UID: "3b741c5a-ce22-4075-86e3-0c1155e94215") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:16.224041 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:16.224008 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-28hkj"] Apr 20 16:23:16.226133 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:16.226108 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28hkj" Apr 20 16:23:16.226289 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:16.226193 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-28hkj" podUID="09560999-6ebd-4da5-b805-d700919dfb04" Apr 20 16:23:16.252524 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:16.252486 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/09560999-6ebd-4da5-b805-d700919dfb04-kubelet-config\") pod \"global-pull-secret-syncer-28hkj\" (UID: \"09560999-6ebd-4da5-b805-d700919dfb04\") " pod="kube-system/global-pull-secret-syncer-28hkj" Apr 20 16:23:16.252716 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:16.252549 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/09560999-6ebd-4da5-b805-d700919dfb04-dbus\") pod \"global-pull-secret-syncer-28hkj\" (UID: \"09560999-6ebd-4da5-b805-d700919dfb04\") " pod="kube-system/global-pull-secret-syncer-28hkj" Apr 20 16:23:16.252716 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:16.252588 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/09560999-6ebd-4da5-b805-d700919dfb04-original-pull-secret\") pod \"global-pull-secret-syncer-28hkj\" (UID: \"09560999-6ebd-4da5-b805-d700919dfb04\") " pod="kube-system/global-pull-secret-syncer-28hkj" Apr 20 16:23:16.353159 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:16.353119 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/09560999-6ebd-4da5-b805-d700919dfb04-dbus\") pod \"global-pull-secret-syncer-28hkj\" (UID: \"09560999-6ebd-4da5-b805-d700919dfb04\") " pod="kube-system/global-pull-secret-syncer-28hkj" Apr 20 16:23:16.353604 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:16.353195 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/09560999-6ebd-4da5-b805-d700919dfb04-original-pull-secret\") pod \"global-pull-secret-syncer-28hkj\" (UID: \"09560999-6ebd-4da5-b805-d700919dfb04\") " pod="kube-system/global-pull-secret-syncer-28hkj" Apr 20 16:23:16.353604 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:16.353254 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/09560999-6ebd-4da5-b805-d700919dfb04-kubelet-config\") pod \"global-pull-secret-syncer-28hkj\" (UID: \"09560999-6ebd-4da5-b805-d700919dfb04\") " pod="kube-system/global-pull-secret-syncer-28hkj" Apr 20 16:23:16.353604 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:16.353350 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/09560999-6ebd-4da5-b805-d700919dfb04-kubelet-config\") pod \"global-pull-secret-syncer-28hkj\" (UID: \"09560999-6ebd-4da5-b805-d700919dfb04\") " pod="kube-system/global-pull-secret-syncer-28hkj" Apr 20 16:23:16.353604 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:16.353431 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/09560999-6ebd-4da5-b805-d700919dfb04-dbus\") pod \"global-pull-secret-syncer-28hkj\" (UID: \"09560999-6ebd-4da5-b805-d700919dfb04\") " pod="kube-system/global-pull-secret-syncer-28hkj" Apr 20 16:23:16.353604 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:16.353522 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 16:23:16.353604 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:16.353591 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09560999-6ebd-4da5-b805-d700919dfb04-original-pull-secret podName:09560999-6ebd-4da5-b805-d700919dfb04 nodeName:}" failed. 
No retries permitted until 2026-04-20 16:23:16.853572187 +0000 UTC m=+5.079780109 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/09560999-6ebd-4da5-b805-d700919dfb04-original-pull-secret") pod "global-pull-secret-syncer-28hkj" (UID: "09560999-6ebd-4da5-b805-d700919dfb04") : object "kube-system"/"original-pull-secret" not registered Apr 20 16:23:16.360475 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:16.360400 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cq4h4" Apr 20 16:23:16.360613 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:16.360513 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cq4h4" podUID="3b741c5a-ce22-4075-86e3-0c1155e94215" Apr 20 16:23:16.360613 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:16.360611 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5xd" Apr 20 16:23:16.360753 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:16.360695 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tr5xd" podUID="7948c105-68aa-437a-a0ac-fa0d535c7b37" Apr 20 16:23:16.390419 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:16.390358 2576 generic.go:358] "Generic (PLEG): container finished" podID="baf1f649d76438d5a7a5af68df1c18c0" containerID="1dae6de36cb32fed216710b2058cd73657ae37435ccba5007b5e42e1c7f627b2" exitCode=0 Apr 20 16:23:16.390585 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:16.390560 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-44.ec2.internal" event={"ID":"baf1f649d76438d5a7a5af68df1c18c0","Type":"ContainerDied","Data":"1dae6de36cb32fed216710b2058cd73657ae37435ccba5007b5e42e1c7f627b2"} Apr 20 16:23:16.856386 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:16.856347 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/09560999-6ebd-4da5-b805-d700919dfb04-original-pull-secret\") pod \"global-pull-secret-syncer-28hkj\" (UID: \"09560999-6ebd-4da5-b805-d700919dfb04\") " pod="kube-system/global-pull-secret-syncer-28hkj" Apr 20 16:23:16.861211 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:16.861179 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 16:23:16.861364 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:16.861271 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09560999-6ebd-4da5-b805-d700919dfb04-original-pull-secret podName:09560999-6ebd-4da5-b805-d700919dfb04 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:17.861250796 +0000 UTC m=+6.087458730 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/09560999-6ebd-4da5-b805-d700919dfb04-original-pull-secret") pod "global-pull-secret-syncer-28hkj" (UID: "09560999-6ebd-4da5-b805-d700919dfb04") : object "kube-system"/"original-pull-secret" not registered Apr 20 16:23:17.409538 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:17.409451 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-44.ec2.internal" event={"ID":"baf1f649d76438d5a7a5af68df1c18c0","Type":"ContainerStarted","Data":"24d1cc48a499298ac29c152e67fd357e73dcdb1b73725eddce36c29e27ae156c"} Apr 20 16:23:17.423201 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:17.423142 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-44.ec2.internal" podStartSLOduration=4.423122922 podStartE2EDuration="4.423122922s" podCreationTimestamp="2026-04-20 16:23:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 16:23:17.422499605 +0000 UTC m=+5.648707548" watchObservedRunningTime="2026-04-20 16:23:17.423122922 +0000 UTC m=+5.649330865" Apr 20 16:23:17.866474 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:17.866388 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs\") pod \"network-metrics-daemon-tr5xd\" (UID: \"7948c105-68aa-437a-a0ac-fa0d535c7b37\") " pod="openshift-multus/network-metrics-daemon-tr5xd" Apr 20 16:23:17.866647 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:17.866481 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/09560999-6ebd-4da5-b805-d700919dfb04-original-pull-secret\") 
pod \"global-pull-secret-syncer-28hkj\" (UID: \"09560999-6ebd-4da5-b805-d700919dfb04\") " pod="kube-system/global-pull-secret-syncer-28hkj" Apr 20 16:23:17.866647 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:17.866637 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 16:23:17.866781 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:17.866700 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09560999-6ebd-4da5-b805-d700919dfb04-original-pull-secret podName:09560999-6ebd-4da5-b805-d700919dfb04 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:19.866681899 +0000 UTC m=+8.092889825 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/09560999-6ebd-4da5-b805-d700919dfb04-original-pull-secret") pod "global-pull-secret-syncer-28hkj" (UID: "09560999-6ebd-4da5-b805-d700919dfb04") : object "kube-system"/"original-pull-secret" not registered Apr 20 16:23:17.867026 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:17.866879 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:17.867026 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:17.866927 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs podName:7948c105-68aa-437a-a0ac-fa0d535c7b37 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:21.866914365 +0000 UTC m=+10.093122289 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs") pod "network-metrics-daemon-tr5xd" (UID: "7948c105-68aa-437a-a0ac-fa0d535c7b37") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:17.967420 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:17.967381 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ln67g\" (UniqueName: \"kubernetes.io/projected/3b741c5a-ce22-4075-86e3-0c1155e94215-kube-api-access-ln67g\") pod \"network-check-target-cq4h4\" (UID: \"3b741c5a-ce22-4075-86e3-0c1155e94215\") " pod="openshift-network-diagnostics/network-check-target-cq4h4" Apr 20 16:23:17.967609 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:17.967551 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 16:23:17.967609 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:17.967572 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 16:23:17.967609 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:17.967583 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ln67g for pod openshift-network-diagnostics/network-check-target-cq4h4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:17.967834 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:17.967640 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b741c5a-ce22-4075-86e3-0c1155e94215-kube-api-access-ln67g podName:3b741c5a-ce22-4075-86e3-0c1155e94215 nodeName:}" failed. 
No retries permitted until 2026-04-20 16:23:21.967620893 +0000 UTC m=+10.193828829 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ln67g" (UniqueName: "kubernetes.io/projected/3b741c5a-ce22-4075-86e3-0c1155e94215-kube-api-access-ln67g") pod "network-check-target-cq4h4" (UID: "3b741c5a-ce22-4075-86e3-0c1155e94215") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:18.358141 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:18.358049 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cq4h4" Apr 20 16:23:18.358294 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:18.358188 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cq4h4" podUID="3b741c5a-ce22-4075-86e3-0c1155e94215" Apr 20 16:23:18.358791 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:18.358559 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28hkj" Apr 20 16:23:18.358791 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:18.358588 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5xd" Apr 20 16:23:18.358791 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:18.358674 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28hkj" podUID="09560999-6ebd-4da5-b805-d700919dfb04" Apr 20 16:23:18.358791 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:18.358775 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tr5xd" podUID="7948c105-68aa-437a-a0ac-fa0d535c7b37" Apr 20 16:23:19.886216 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:19.885849 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/09560999-6ebd-4da5-b805-d700919dfb04-original-pull-secret\") pod \"global-pull-secret-syncer-28hkj\" (UID: \"09560999-6ebd-4da5-b805-d700919dfb04\") " pod="kube-system/global-pull-secret-syncer-28hkj" Apr 20 16:23:19.886216 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:19.886037 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 16:23:19.886216 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:19.886117 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09560999-6ebd-4da5-b805-d700919dfb04-original-pull-secret podName:09560999-6ebd-4da5-b805-d700919dfb04 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:23.88609057 +0000 UTC m=+12.112298505 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/09560999-6ebd-4da5-b805-d700919dfb04-original-pull-secret") pod "global-pull-secret-syncer-28hkj" (UID: "09560999-6ebd-4da5-b805-d700919dfb04") : object "kube-system"/"original-pull-secret" not registered Apr 20 16:23:20.358831 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:20.358737 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cq4h4" Apr 20 16:23:20.358980 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:20.358877 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28hkj" Apr 20 16:23:20.358980 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:20.358880 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cq4h4" podUID="3b741c5a-ce22-4075-86e3-0c1155e94215" Apr 20 16:23:20.358980 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:20.358953 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28hkj" podUID="09560999-6ebd-4da5-b805-d700919dfb04" Apr 20 16:23:20.359135 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:20.359105 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5xd" Apr 20 16:23:20.359228 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:20.359201 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tr5xd" podUID="7948c105-68aa-437a-a0ac-fa0d535c7b37" Apr 20 16:23:21.900984 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:21.900947 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs\") pod \"network-metrics-daemon-tr5xd\" (UID: \"7948c105-68aa-437a-a0ac-fa0d535c7b37\") " pod="openshift-multus/network-metrics-daemon-tr5xd" Apr 20 16:23:21.901433 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:21.901099 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:21.901433 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:21.901161 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs podName:7948c105-68aa-437a-a0ac-fa0d535c7b37 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:29.901142898 +0000 UTC m=+18.127350824 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs") pod "network-metrics-daemon-tr5xd" (UID: "7948c105-68aa-437a-a0ac-fa0d535c7b37") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 16:23:22.001728 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:22.001686 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ln67g\" (UniqueName: \"kubernetes.io/projected/3b741c5a-ce22-4075-86e3-0c1155e94215-kube-api-access-ln67g\") pod \"network-check-target-cq4h4\" (UID: \"3b741c5a-ce22-4075-86e3-0c1155e94215\") " pod="openshift-network-diagnostics/network-check-target-cq4h4" Apr 20 16:23:22.001918 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:22.001889 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 16:23:22.001918 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:22.001919 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 16:23:22.002019 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:22.001933 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ln67g for pod openshift-network-diagnostics/network-check-target-cq4h4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:22.002019 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:22.001994 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b741c5a-ce22-4075-86e3-0c1155e94215-kube-api-access-ln67g podName:3b741c5a-ce22-4075-86e3-0c1155e94215 nodeName:}" failed. 
No retries permitted until 2026-04-20 16:23:30.00197609 +0000 UTC m=+18.228184016 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ln67g" (UniqueName: "kubernetes.io/projected/3b741c5a-ce22-4075-86e3-0c1155e94215-kube-api-access-ln67g") pod "network-check-target-cq4h4" (UID: "3b741c5a-ce22-4075-86e3-0c1155e94215") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 16:23:22.360534 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:22.359458 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5xd" Apr 20 16:23:22.360534 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:22.359941 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tr5xd" podUID="7948c105-68aa-437a-a0ac-fa0d535c7b37" Apr 20 16:23:22.360534 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:22.360005 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cq4h4" Apr 20 16:23:22.360534 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:22.360055 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cq4h4" podUID="3b741c5a-ce22-4075-86e3-0c1155e94215"
Apr 20 16:23:22.360534 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:22.360381 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28hkj"
Apr 20 16:23:22.360534 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:22.360440 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28hkj" podUID="09560999-6ebd-4da5-b805-d700919dfb04"
Apr 20 16:23:23.923338 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:23.923299 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/09560999-6ebd-4da5-b805-d700919dfb04-original-pull-secret\") pod \"global-pull-secret-syncer-28hkj\" (UID: \"09560999-6ebd-4da5-b805-d700919dfb04\") " pod="kube-system/global-pull-secret-syncer-28hkj"
Apr 20 16:23:23.923792 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:23.923434 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 16:23:23.923792 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:23.923499 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09560999-6ebd-4da5-b805-d700919dfb04-original-pull-secret podName:09560999-6ebd-4da5-b805-d700919dfb04 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:31.923480319 +0000 UTC m=+20.149688241 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/09560999-6ebd-4da5-b805-d700919dfb04-original-pull-secret") pod "global-pull-secret-syncer-28hkj" (UID: "09560999-6ebd-4da5-b805-d700919dfb04") : object "kube-system"/"original-pull-secret" not registered
Apr 20 16:23:24.358269 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:24.358209 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28hkj"
Apr 20 16:23:24.358431 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:24.358274 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5xd"
Apr 20 16:23:24.358431 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:24.358375 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cq4h4"
Apr 20 16:23:24.358431 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:24.358384 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28hkj" podUID="09560999-6ebd-4da5-b805-d700919dfb04"
Apr 20 16:23:24.358595 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:24.358499 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tr5xd" podUID="7948c105-68aa-437a-a0ac-fa0d535c7b37"
Apr 20 16:23:24.358595 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:24.358555 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cq4h4" podUID="3b741c5a-ce22-4075-86e3-0c1155e94215"
Apr 20 16:23:26.358307 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:26.358270 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28hkj"
Apr 20 16:23:26.358697 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:26.358329 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cq4h4"
Apr 20 16:23:26.358697 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:26.358400 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28hkj" podUID="09560999-6ebd-4da5-b805-d700919dfb04"
Apr 20 16:23:26.358697 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:26.358464 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5xd"
Apr 20 16:23:26.358697 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:26.358544 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tr5xd" podUID="7948c105-68aa-437a-a0ac-fa0d535c7b37"
Apr 20 16:23:26.358697 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:26.358593 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cq4h4" podUID="3b741c5a-ce22-4075-86e3-0c1155e94215"
Apr 20 16:23:28.358906 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:28.358867 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cq4h4"
Apr 20 16:23:28.359314 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:28.358868 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28hkj"
Apr 20 16:23:28.359314 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:28.358983 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cq4h4" podUID="3b741c5a-ce22-4075-86e3-0c1155e94215"
Apr 20 16:23:28.359314 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:28.359079 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28hkj" podUID="09560999-6ebd-4da5-b805-d700919dfb04"
Apr 20 16:23:28.359314 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:28.358873 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5xd"
Apr 20 16:23:28.359314 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:28.359205 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tr5xd" podUID="7948c105-68aa-437a-a0ac-fa0d535c7b37"
Apr 20 16:23:29.965539 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:29.965496 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs\") pod \"network-metrics-daemon-tr5xd\" (UID: \"7948c105-68aa-437a-a0ac-fa0d535c7b37\") " pod="openshift-multus/network-metrics-daemon-tr5xd"
Apr 20 16:23:29.966001 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:29.965659 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 16:23:29.966001 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:29.965734 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs podName:7948c105-68aa-437a-a0ac-fa0d535c7b37 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:45.965714103 +0000 UTC m=+34.191922023 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs") pod "network-metrics-daemon-tr5xd" (UID: "7948c105-68aa-437a-a0ac-fa0d535c7b37") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 16:23:30.066714 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:30.066667 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ln67g\" (UniqueName: \"kubernetes.io/projected/3b741c5a-ce22-4075-86e3-0c1155e94215-kube-api-access-ln67g\") pod \"network-check-target-cq4h4\" (UID: \"3b741c5a-ce22-4075-86e3-0c1155e94215\") " pod="openshift-network-diagnostics/network-check-target-cq4h4"
Apr 20 16:23:30.066911 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:30.066880 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 16:23:30.066911 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:30.066907 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 16:23:30.067014 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:30.066918 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ln67g for pod openshift-network-diagnostics/network-check-target-cq4h4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 16:23:30.067014 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:30.066984 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b741c5a-ce22-4075-86e3-0c1155e94215-kube-api-access-ln67g podName:3b741c5a-ce22-4075-86e3-0c1155e94215 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:46.066964989 +0000 UTC m=+34.293172931 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ln67g" (UniqueName: "kubernetes.io/projected/3b741c5a-ce22-4075-86e3-0c1155e94215-kube-api-access-ln67g") pod "network-check-target-cq4h4" (UID: "3b741c5a-ce22-4075-86e3-0c1155e94215") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 16:23:30.358428 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:30.358386 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28hkj"
Apr 20 16:23:30.358428 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:30.358387 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cq4h4"
Apr 20 16:23:30.358651 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:30.358453 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5xd"
Apr 20 16:23:30.358651 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:30.358521 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cq4h4" podUID="3b741c5a-ce22-4075-86e3-0c1155e94215"
Apr 20 16:23:30.358651 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:30.358584 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tr5xd" podUID="7948c105-68aa-437a-a0ac-fa0d535c7b37"
Apr 20 16:23:30.358780 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:30.358651 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28hkj" podUID="09560999-6ebd-4da5-b805-d700919dfb04"
Apr 20 16:23:31.978336 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:31.978288 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/09560999-6ebd-4da5-b805-d700919dfb04-original-pull-secret\") pod \"global-pull-secret-syncer-28hkj\" (UID: \"09560999-6ebd-4da5-b805-d700919dfb04\") " pod="kube-system/global-pull-secret-syncer-28hkj"
Apr 20 16:23:31.978783 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:31.978411 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 16:23:31.978783 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:31.978464 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09560999-6ebd-4da5-b805-d700919dfb04-original-pull-secret podName:09560999-6ebd-4da5-b805-d700919dfb04 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:47.978450372 +0000 UTC m=+36.204658292 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/09560999-6ebd-4da5-b805-d700919dfb04-original-pull-secret") pod "global-pull-secret-syncer-28hkj" (UID: "09560999-6ebd-4da5-b805-d700919dfb04") : object "kube-system"/"original-pull-secret" not registered
Apr 20 16:23:32.358416 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:32.358388 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28hkj"
Apr 20 16:23:32.358552 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:32.358526 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28hkj" podUID="09560999-6ebd-4da5-b805-d700919dfb04"
Apr 20 16:23:32.359455 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:32.358991 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5xd"
Apr 20 16:23:32.359455 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:32.359100 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tr5xd" podUID="7948c105-68aa-437a-a0ac-fa0d535c7b37"
Apr 20 16:23:32.360229 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:32.360201 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cq4h4"
Apr 20 16:23:32.360331 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:32.360297 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cq4h4" podUID="3b741c5a-ce22-4075-86e3-0c1155e94215"
Apr 20 16:23:32.440548 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:32.440503 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" event={"ID":"3c78e1c2-fb6e-458b-8593-64d3e48a714e","Type":"ContainerStarted","Data":"1d84beb598855fd9c817b766b620ed594facc62e88ff3d299c2dd6beee832bf4"}
Apr 20 16:23:32.442537 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:32.442450 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-v7jxk" event={"ID":"23487f52-5abf-4f26-b6e5-427ce8611cdb","Type":"ContainerStarted","Data":"7bfcaa590f3afa1e994a464f3e6e0bab773d4760a5cbe72748245050a65ad6f1"}
Apr 20 16:23:32.445297 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:32.444686 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7q74k" event={"ID":"b0527932-43f9-4c44-8a48-e6b0fc353de6","Type":"ContainerStarted","Data":"85f91f02e7ef90ee71e810eed6cc60b8710810d3eccca84cfcfee507880f62d4"}
Apr 20 16:23:32.448304 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:32.447988 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zblls" event={"ID":"9df8f577-8118-408c-a41e-c4568ea0c8ef","Type":"ContainerStarted","Data":"3791d1830ce25dba81073797e3c8a47941c3a72a4ad9315a0c297e79e6e19d1f"}
Apr 20 16:23:32.457876 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:32.457490 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-v7jxk" podStartSLOduration=3.484265229 podStartE2EDuration="20.457474216s" podCreationTimestamp="2026-04-20 16:23:12 +0000 UTC" firstStartedPulling="2026-04-20 16:23:15.1929506 +0000 UTC m=+3.419158521" lastFinishedPulling="2026-04-20 16:23:32.166159586 +0000 UTC m=+20.392367508" observedRunningTime="2026-04-20 16:23:32.456618913 +0000 UTC m=+20.682826856" watchObservedRunningTime="2026-04-20 16:23:32.457474216 +0000 UTC m=+20.683682159"
Apr 20 16:23:32.473333 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:32.472842 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-7q74k" podStartSLOduration=3.53273501 podStartE2EDuration="20.472821343s" podCreationTimestamp="2026-04-20 16:23:12 +0000 UTC" firstStartedPulling="2026-04-20 16:23:15.190597345 +0000 UTC m=+3.416805265" lastFinishedPulling="2026-04-20 16:23:32.130683673 +0000 UTC m=+20.356891598" observedRunningTime="2026-04-20 16:23:32.472355829 +0000 UTC m=+20.698563774" watchObservedRunningTime="2026-04-20 16:23:32.472821343 +0000 UTC m=+20.699029287"
Apr 20 16:23:32.486080 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:32.486040 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-zblls" podStartSLOduration=3.552538649 podStartE2EDuration="20.486025747s" podCreationTimestamp="2026-04-20 16:23:12 +0000 UTC" firstStartedPulling="2026-04-20 16:23:15.197229944 +0000 UTC m=+3.423437869" lastFinishedPulling="2026-04-20 16:23:32.130717044 +0000 UTC m=+20.356924967" observedRunningTime="2026-04-20 16:23:32.485964645 +0000 UTC m=+20.712172588" watchObservedRunningTime="2026-04-20 16:23:32.486025747 +0000 UTC m=+20.712233689"
Apr 20 16:23:33.451224 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:33.451187 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wn2xt" event={"ID":"bc10ffb1-dd19-4a22-a3ed-7437a80f1ba7","Type":"ContainerStarted","Data":"ae58bd4ce4a858db90e2fff76e2d1b0a48b086c5dca8faa6015aee7cc6b57a16"}
Apr 20 16:23:33.452393 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:33.452329 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tnl7m" event={"ID":"0f424042-eb12-467e-85c1-cbdd302c3e4d","Type":"ContainerStarted","Data":"668f66be9e85f107dc4928a3ed158964493fcfb209812616c571864d3341fac7"}
Apr 20 16:23:33.454532 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:33.454511 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s95ld_3c78e1c2-fb6e-458b-8593-64d3e48a714e/ovn-acl-logging/0.log"
Apr 20 16:23:33.454863 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:33.454844 2576 generic.go:358] "Generic (PLEG): container finished" podID="3c78e1c2-fb6e-458b-8593-64d3e48a714e" containerID="3f616c5935a5942ec11dfc6f53f3c9e541e21f6fde4b2dc57b2e0e1d8c7d0a02" exitCode=1
Apr 20 16:23:33.454916 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:33.454900 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" event={"ID":"3c78e1c2-fb6e-458b-8593-64d3e48a714e","Type":"ContainerStarted","Data":"7c79b2e09090aa05768694f10e908cfe916b0eb78a777abece03cfbf4de40621"}
Apr 20 16:23:33.454947 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:33.454922 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" event={"ID":"3c78e1c2-fb6e-458b-8593-64d3e48a714e","Type":"ContainerStarted","Data":"8a3c719df46e44f9892d3ad75a70b4052003248a9520d7c1a4005742b9b8165d"}
Apr 20 16:23:33.454947 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:33.454933 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" event={"ID":"3c78e1c2-fb6e-458b-8593-64d3e48a714e","Type":"ContainerStarted","Data":"d7c286c6c170c41734c7106c39bc53cadf46bc9c5ca092bbce2eb343b4829a03"}
Apr 20 16:23:33.454947 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:33.454943 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" event={"ID":"3c78e1c2-fb6e-458b-8593-64d3e48a714e","Type":"ContainerStarted","Data":"7de56c46f5878a8e9dc5ddcf587da42b7504a293a49f57da134f27a3a59441e3"}
Apr 20 16:23:33.455061 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:33.454955 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" event={"ID":"3c78e1c2-fb6e-458b-8593-64d3e48a714e","Type":"ContainerDied","Data":"3f616c5935a5942ec11dfc6f53f3c9e541e21f6fde4b2dc57b2e0e1d8c7d0a02"}
Apr 20 16:23:33.456149 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:33.456123 2576 generic.go:358] "Generic (PLEG): container finished" podID="1bb216ed-aa87-4017-b000-0f3d37d1fda9" containerID="8652d9a9a9db8a258d0dcfbc96984c37317f212624627488d82958ceb4eedc35" exitCode=0
Apr 20 16:23:33.456260 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:33.456151 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wsgnn" event={"ID":"1bb216ed-aa87-4017-b000-0f3d37d1fda9","Type":"ContainerDied","Data":"8652d9a9a9db8a258d0dcfbc96984c37317f212624627488d82958ceb4eedc35"}
Apr 20 16:23:33.457612 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:33.457382 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd" event={"ID":"8173344d-6d10-4de8-8cc3-ce875eb9dc21","Type":"ContainerStarted","Data":"584ca897dcae4b5f8165118da65708dd211fa8efb18683683154d1fde52a754a"}
Apr 20 16:23:33.463038 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:33.462998 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-wn2xt" podStartSLOduration=4.526961155 podStartE2EDuration="21.462986923s" podCreationTimestamp="2026-04-20 16:23:12 +0000 UTC" firstStartedPulling="2026-04-20 16:23:15.19468657 +0000 UTC m=+3.420894503" lastFinishedPulling="2026-04-20 16:23:32.13071235 +0000 UTC m=+20.356920271" observedRunningTime="2026-04-20 16:23:33.462939921 +0000 UTC m=+21.689147863" watchObservedRunningTime="2026-04-20 16:23:33.462986923 +0000 UTC m=+21.689194865"
Apr 20 16:23:33.474888 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:33.474839 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-tnl7m" podStartSLOduration=4.540295043 podStartE2EDuration="21.474823889s" podCreationTimestamp="2026-04-20 16:23:12 +0000 UTC" firstStartedPulling="2026-04-20 16:23:15.196567133 +0000 UTC m=+3.422775067" lastFinishedPulling="2026-04-20 16:23:32.131095988 +0000 UTC m=+20.357303913" observedRunningTime="2026-04-20 16:23:33.474799115 +0000 UTC m=+21.701007056" watchObservedRunningTime="2026-04-20 16:23:33.474823889 +0000 UTC m=+21.701031809"
Apr 20 16:23:33.981667 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:33.981640 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 20 16:23:34.302017 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:34.301893 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T16:23:33.981660351Z","UUID":"6c2ce794-6cd8-45b0-97cf-049380d78df3","Handler":null,"Name":"","Endpoint":""}
Apr 20 16:23:34.303426 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:34.303406 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 20 16:23:34.303426 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:34.303433 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 20 16:23:34.358418 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:34.358378 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cq4h4"
Apr 20 16:23:34.358601 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:34.358509 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cq4h4" podUID="3b741c5a-ce22-4075-86e3-0c1155e94215"
Apr 20 16:23:34.358601 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:34.358539 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5xd"
Apr 20 16:23:34.358716 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:34.358542 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28hkj"
Apr 20 16:23:34.358787 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:34.358652 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tr5xd" podUID="7948c105-68aa-437a-a0ac-fa0d535c7b37"
Apr 20 16:23:34.358853 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:34.358787 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28hkj" podUID="09560999-6ebd-4da5-b805-d700919dfb04"
Apr 20 16:23:34.460773 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:34.460719 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-f2nfg" event={"ID":"c65cbd12-d018-46ff-8d22-86c1a6fb9204","Type":"ContainerStarted","Data":"032b7aebe835ad0ded2ea6041826c248ba8005a1ab5a1aab970b8cab779eb923"}
Apr 20 16:23:34.462309 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:34.462284 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd" event={"ID":"8173344d-6d10-4de8-8cc3-ce875eb9dc21","Type":"ContainerStarted","Data":"8b341512865ceaa4d0ddf3202e5bde9a9edb9cf1e21130436f34a5f7b300453c"}
Apr 20 16:23:34.513644 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:34.513564 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-f2nfg" podStartSLOduration=5.582327344 podStartE2EDuration="22.513545369s" podCreationTimestamp="2026-04-20 16:23:12 +0000 UTC" firstStartedPulling="2026-04-20 16:23:15.199478107 +0000 UTC m=+3.425686039" lastFinishedPulling="2026-04-20 16:23:32.130696144 +0000 UTC m=+20.356904064" observedRunningTime="2026-04-20 16:23:34.512892763 +0000 UTC m=+22.739100706" watchObservedRunningTime="2026-04-20 16:23:34.513545369 +0000 UTC m=+22.739753314"
Apr 20 16:23:34.739610 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:34.739518 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-7q74k"
Apr 20 16:23:34.740242 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:34.740217 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-7q74k"
Apr 20 16:23:35.467523 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:35.467500 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s95ld_3c78e1c2-fb6e-458b-8593-64d3e48a714e/ovn-acl-logging/0.log"
Apr 20 16:23:35.468056 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:35.467898 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" event={"ID":"3c78e1c2-fb6e-458b-8593-64d3e48a714e","Type":"ContainerStarted","Data":"8df847dc4a73bec1aa2f856cbd37168e513c1f2377236636320b2a0a6c0e33a2"}
Apr 20 16:23:36.358082 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:36.358047 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cq4h4"
Apr 20 16:23:36.358082 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:36.358067 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28hkj"
Apr 20 16:23:36.358341 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:36.358047 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5xd"
Apr 20 16:23:36.358341 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:36.358180 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cq4h4" podUID="3b741c5a-ce22-4075-86e3-0c1155e94215"
Apr 20 16:23:36.358341 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:36.358262 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tr5xd" podUID="7948c105-68aa-437a-a0ac-fa0d535c7b37"
Apr 20 16:23:36.358341 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:36.358328 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28hkj" podUID="09560999-6ebd-4da5-b805-d700919dfb04"
Apr 20 16:23:36.472250 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:36.472211 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd" event={"ID":"8173344d-6d10-4de8-8cc3-ce875eb9dc21","Type":"ContainerStarted","Data":"12f5240d28fe1a7715d9068fd3969791a62db6132e38718ca5cfa25703483713"}
Apr 20 16:23:36.489134 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:36.489083 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gfjdd" podStartSLOduration=4.219260691 podStartE2EDuration="24.489067627s" podCreationTimestamp="2026-04-20 16:23:12 +0000 UTC" firstStartedPulling="2026-04-20 16:23:15.195693648 +0000 UTC m=+3.421901579" lastFinishedPulling="2026-04-20 16:23:35.465500576 +0000 UTC m=+23.691708515" observedRunningTime="2026-04-20 16:23:36.488630474 +0000 UTC m=+24.714838418" watchObservedRunningTime="2026-04-20 16:23:36.489067627 +0000 UTC m=+24.715275568"
Apr 20 16:23:38.358381 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:38.358343 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28hkj"
Apr 20 16:23:38.358852 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:38.358343 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cq4h4"
Apr 20 16:23:38.358852 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:38.358473 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28hkj" podUID="09560999-6ebd-4da5-b805-d700919dfb04"
Apr 20 16:23:38.358852 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:38.358343 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5xd"
Apr 20 16:23:38.358852 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:38.358555 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cq4h4" podUID="3b741c5a-ce22-4075-86e3-0c1155e94215"
Apr 20 16:23:38.358852 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:38.358634 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tr5xd" podUID="7948c105-68aa-437a-a0ac-fa0d535c7b37"
Apr 20 16:23:39.481090 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:39.480927 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s95ld_3c78e1c2-fb6e-458b-8593-64d3e48a714e/ovn-acl-logging/0.log"
Apr 20 16:23:39.481555 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:39.481388 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" event={"ID":"3c78e1c2-fb6e-458b-8593-64d3e48a714e","Type":"ContainerStarted","Data":"5e77e93ba9960ca191f510e15ae1c0b45cae6517ab4c901ca69ad6301602c28f"}
Apr 20 16:23:39.481688 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:39.481664 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-s95ld"
Apr 20 16:23:39.481772 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:39.481699 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-s95ld"
Apr 20 16:23:39.481924 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:39.481904 2576 scope.go:117] "RemoveContainer" containerID="3f616c5935a5942ec11dfc6f53f3c9e541e21f6fde4b2dc57b2e0e1d8c7d0a02"
Apr 20 16:23:39.483164 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:39.483127 2576 generic.go:358] "Generic (PLEG): container finished" podID="1bb216ed-aa87-4017-b000-0f3d37d1fda9" containerID="7ee6bf50b10a49da1a7378293b0c1188d9efd8850e01bce2ee9b775f31ed86c4" exitCode=0
Apr 20 16:23:39.483258 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:39.483182 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wsgnn" event={"ID":"1bb216ed-aa87-4017-b000-0f3d37d1fda9","Type":"ContainerDied","Data":"7ee6bf50b10a49da1a7378293b0c1188d9efd8850e01bce2ee9b775f31ed86c4"}
Apr 20 16:23:39.497872 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:39.497847 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s95ld"
Apr 20 16:23:39.911674 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:39.911472 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-7q74k"
Apr 20 16:23:39.911897 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:39.911807 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 20 16:23:39.912168 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:39.912142 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-7q74k"
Apr 20 16:23:40.358302 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:40.358269 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cq4h4"
Apr 20 16:23:40.358479 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:40.358413 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-cq4h4" podUID="3b741c5a-ce22-4075-86e3-0c1155e94215" Apr 20 16:23:40.358924 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:40.358569 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28hkj" Apr 20 16:23:40.359085 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:40.359031 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28hkj" podUID="09560999-6ebd-4da5-b805-d700919dfb04" Apr 20 16:23:40.359195 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:40.359180 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5xd" Apr 20 16:23:40.359334 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:40.359286 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tr5xd" podUID="7948c105-68aa-437a-a0ac-fa0d535c7b37" Apr 20 16:23:40.487979 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:40.487954 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s95ld_3c78e1c2-fb6e-458b-8593-64d3e48a714e/ovn-acl-logging/0.log" Apr 20 16:23:40.488351 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:40.488234 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" event={"ID":"3c78e1c2-fb6e-458b-8593-64d3e48a714e","Type":"ContainerStarted","Data":"bb4c215130b4116b1ff74a84540cecd325275b97c090cff6b7ccebdf502fdb32"} Apr 20 16:23:40.488447 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:40.488432 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:40.490135 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:40.490105 2576 generic.go:358] "Generic (PLEG): container finished" podID="1bb216ed-aa87-4017-b000-0f3d37d1fda9" containerID="d80cc4977afd402b6a34d620a876ba9771a9881f92b8054a729ca69bd8aae571" exitCode=0 Apr 20 16:23:40.490245 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:40.490183 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wsgnn" event={"ID":"1bb216ed-aa87-4017-b000-0f3d37d1fda9","Type":"ContainerDied","Data":"d80cc4977afd402b6a34d620a876ba9771a9881f92b8054a729ca69bd8aae571"} Apr 20 16:23:40.502828 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:40.502776 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" Apr 20 16:23:40.516535 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:40.516485 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-s95ld" podStartSLOduration=11.490758714 podStartE2EDuration="28.51647101s" 
podCreationTimestamp="2026-04-20 16:23:12 +0000 UTC" firstStartedPulling="2026-04-20 16:23:15.194144807 +0000 UTC m=+3.420352742" lastFinishedPulling="2026-04-20 16:23:32.219857118 +0000 UTC m=+20.446065038" observedRunningTime="2026-04-20 16:23:40.516120323 +0000 UTC m=+28.742328264" watchObservedRunningTime="2026-04-20 16:23:40.51647101 +0000 UTC m=+28.742678952" Apr 20 16:23:40.640282 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:40.640213 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-28hkj"] Apr 20 16:23:40.640410 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:40.640323 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28hkj" Apr 20 16:23:40.640451 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:40.640428 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28hkj" podUID="09560999-6ebd-4da5-b805-d700919dfb04" Apr 20 16:23:40.643410 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:40.643385 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tr5xd"] Apr 20 16:23:40.643533 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:40.643503 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5xd" Apr 20 16:23:40.643664 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:40.643642 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tr5xd" podUID="7948c105-68aa-437a-a0ac-fa0d535c7b37" Apr 20 16:23:40.644196 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:40.644176 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cq4h4"] Apr 20 16:23:40.644272 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:40.644261 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cq4h4" Apr 20 16:23:40.644359 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:40.644339 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cq4h4" podUID="3b741c5a-ce22-4075-86e3-0c1155e94215" Apr 20 16:23:41.494656 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:41.494574 2576 generic.go:358] "Generic (PLEG): container finished" podID="1bb216ed-aa87-4017-b000-0f3d37d1fda9" containerID="01198477a13bd02e72499610e598b41bbb04d7fa93513911fc53f6d6544a2518" exitCode=0 Apr 20 16:23:41.495027 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:41.494656 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wsgnn" event={"ID":"1bb216ed-aa87-4017-b000-0f3d37d1fda9","Type":"ContainerDied","Data":"01198477a13bd02e72499610e598b41bbb04d7fa93513911fc53f6d6544a2518"} Apr 20 16:23:42.359748 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:42.359714 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cq4h4" Apr 20 16:23:42.359950 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:42.359843 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cq4h4" podUID="3b741c5a-ce22-4075-86e3-0c1155e94215" Apr 20 16:23:42.359950 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:42.359860 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28hkj" Apr 20 16:23:42.359950 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:42.359898 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5xd" Apr 20 16:23:42.360118 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:42.359977 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28hkj" podUID="09560999-6ebd-4da5-b805-d700919dfb04" Apr 20 16:23:42.360118 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:42.360082 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tr5xd" podUID="7948c105-68aa-437a-a0ac-fa0d535c7b37" Apr 20 16:23:44.359003 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:44.358972 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cq4h4" Apr 20 16:23:44.359484 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:44.358975 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5xd" Apr 20 16:23:44.359484 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:44.359074 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cq4h4" podUID="3b741c5a-ce22-4075-86e3-0c1155e94215" Apr 20 16:23:44.359484 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:44.359158 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tr5xd" podUID="7948c105-68aa-437a-a0ac-fa0d535c7b37" Apr 20 16:23:44.359484 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:44.358975 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-28hkj" Apr 20 16:23:44.359484 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:44.359242 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28hkj" podUID="09560999-6ebd-4da5-b805-d700919dfb04" Apr 20 16:23:45.138687 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.138461 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-44.ec2.internal" event="NodeReady" Apr 20 16:23:45.138850 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.138829 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 16:23:45.172207 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.172163 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-768df5c47b-6jpfj"] Apr 20 16:23:45.179645 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.179615 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6556f6497c-5xzdw"] Apr 20 16:23:45.179826 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.179790 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-768df5c47b-6jpfj" Apr 20 16:23:45.182149 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.182125 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 20 16:23:45.182706 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.182480 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 20 16:23:45.182706 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.182500 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 20 16:23:45.182706 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.182507 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 20 16:23:45.182706 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.182585 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74c4768dd5-95lnn"] Apr 20 16:23:45.183001 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.182748 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:23:45.184949 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.184927 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 16:23:45.186333 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.185415 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n"] Apr 20 16:23:45.186333 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.185432 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 16:23:45.186333 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.185550 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 16:23:45.186333 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.185432 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-n225r\"" Apr 20 16:23:45.186333 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.185891 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74c4768dd5-95lnn" Apr 20 16:23:45.188810 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.188625 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-t2rgz"] Apr 20 16:23:45.188906 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.188813 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" Apr 20 16:23:45.189706 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.189677 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 20 16:23:45.190223 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.190172 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-c4d84\"" Apr 20 16:23:45.193365 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.193077 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 20 16:23:45.193365 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.193127 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 20 16:23:45.193365 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.193215 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 20 16:23:45.193365 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.193287 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 20 16:23:45.193633 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.193463 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-768df5c47b-6jpfj"] Apr 20 16:23:45.193633 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.193487 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74c4768dd5-95lnn"] Apr 20 16:23:45.193633 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.193502 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7f8db"] Apr 20 16:23:45.193890 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.193641 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-t2rgz" Apr 20 16:23:45.195659 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.195639 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 16:23:45.196597 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.196578 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 16:23:45.196680 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.196598 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rqfxd\"" Apr 20 16:23:45.196857 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.196841 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 16:23:45.197132 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.197113 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n"] Apr 20 16:23:45.197236 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.197137 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6556f6497c-5xzdw"] Apr 20 16:23:45.197236 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.197231 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7f8db" Apr 20 16:23:45.200820 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.200127 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cmjpg\"" Apr 20 16:23:45.200820 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.200148 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 16:23:45.200820 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.200364 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 16:23:45.200820 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.200501 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 16:23:45.200820 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.200594 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-t2rgz"] Apr 20 16:23:45.205797 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.205688 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7f8db"] Apr 20 16:23:45.270021 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.269993 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/cbda6188-592c-42dc-b064-bb905f0b2e00-klusterlet-config\") pod \"klusterlet-addon-workmgr-768df5c47b-6jpfj\" (UID: \"cbda6188-592c-42dc-b064-bb905f0b2e00\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-768df5c47b-6jpfj" Apr 20 16:23:45.270239 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.270027 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcswx\" 
(UniqueName: \"kubernetes.io/projected/607b691a-53a8-4d6c-9a81-238041e2f614-kube-api-access-pcswx\") pod \"managed-serviceaccount-addon-agent-74c4768dd5-95lnn\" (UID: \"607b691a-53a8-4d6c-9a81-238041e2f614\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74c4768dd5-95lnn" Apr 20 16:23:45.270239 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.270049 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skhpg\" (UniqueName: \"kubernetes.io/projected/c0c73998-a3cf-46ac-88ee-04698be10974-kube-api-access-skhpg\") pod \"dns-default-t2rgz\" (UID: \"c0c73998-a3cf-46ac-88ee-04698be10974\") " pod="openshift-dns/dns-default-t2rgz" Apr 20 16:23:45.270239 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.270072 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjzrn\" (UniqueName: \"kubernetes.io/projected/c666ba21-097e-4fb4-ac00-f607e9a9198f-kube-api-access-cjzrn\") pod \"cluster-proxy-proxy-agent-f6976c7d4-7xx9n\" (UID: \"c666ba21-097e-4fb4-ac00-f607e9a9198f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" Apr 20 16:23:45.270239 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.270179 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/607b691a-53a8-4d6c-9a81-238041e2f614-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-74c4768dd5-95lnn\" (UID: \"607b691a-53a8-4d6c-9a81-238041e2f614\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74c4768dd5-95lnn" Apr 20 16:23:45.270239 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.270216 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/b677bf42-4f9a-44b2-8c42-7b22242cad9b-ca-trust-extracted\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:23:45.270509 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.270270 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c0c73998-a3cf-46ac-88ee-04698be10974-config-volume\") pod \"dns-default-t2rgz\" (UID: \"c0c73998-a3cf-46ac-88ee-04698be10974\") " pod="openshift-dns/dns-default-t2rgz" Apr 20 16:23:45.270509 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.270314 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/c666ba21-097e-4fb4-ac00-f607e9a9198f-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f6976c7d4-7xx9n\" (UID: \"c666ba21-097e-4fb4-ac00-f607e9a9198f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" Apr 20 16:23:45.270509 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.270342 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/c666ba21-097e-4fb4-ac00-f607e9a9198f-hub\") pod \"cluster-proxy-proxy-agent-f6976c7d4-7xx9n\" (UID: \"c666ba21-097e-4fb4-ac00-f607e9a9198f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" Apr 20 16:23:45.270509 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.270402 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/c666ba21-097e-4fb4-ac00-f607e9a9198f-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f6976c7d4-7xx9n\" (UID: \"c666ba21-097e-4fb4-ac00-f607e9a9198f\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" Apr 20 16:23:45.270509 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.270422 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f4bb\" (UniqueName: \"kubernetes.io/projected/0dfe4973-64f0-41f7-a34e-6d35be53c155-kube-api-access-2f4bb\") pod \"ingress-canary-7f8db\" (UID: \"0dfe4973-64f0-41f7-a34e-6d35be53c155\") " pod="openshift-ingress-canary/ingress-canary-7f8db" Apr 20 16:23:45.270509 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.270446 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b677bf42-4f9a-44b2-8c42-7b22242cad9b-trusted-ca\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:23:45.270509 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.270469 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dfe4973-64f0-41f7-a34e-6d35be53c155-cert\") pod \"ingress-canary-7f8db\" (UID: \"0dfe4973-64f0-41f7-a34e-6d35be53c155\") " pod="openshift-ingress-canary/ingress-canary-7f8db" Apr 20 16:23:45.270790 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.270540 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:23:45.270790 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.270569 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-certificates\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:23:45.270790 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.270586 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b677bf42-4f9a-44b2-8c42-7b22242cad9b-installation-pull-secrets\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:23:45.270790 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.270617 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcltf\" (UniqueName: \"kubernetes.io/projected/cbda6188-592c-42dc-b064-bb905f0b2e00-kube-api-access-tcltf\") pod \"klusterlet-addon-workmgr-768df5c47b-6jpfj\" (UID: \"cbda6188-592c-42dc-b064-bb905f0b2e00\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-768df5c47b-6jpfj" Apr 20 16:23:45.270790 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.270644 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b677bf42-4f9a-44b2-8c42-7b22242cad9b-image-registry-private-configuration\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:23:45.270790 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.270667 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cbda6188-592c-42dc-b064-bb905f0b2e00-tmp\") 
pod \"klusterlet-addon-workmgr-768df5c47b-6jpfj\" (UID: \"cbda6188-592c-42dc-b064-bb905f0b2e00\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-768df5c47b-6jpfj" Apr 20 16:23:45.270790 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.270681 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c0c73998-a3cf-46ac-88ee-04698be10974-tmp-dir\") pod \"dns-default-t2rgz\" (UID: \"c0c73998-a3cf-46ac-88ee-04698be10974\") " pod="openshift-dns/dns-default-t2rgz" Apr 20 16:23:45.270790 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.270705 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c666ba21-097e-4fb4-ac00-f607e9a9198f-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f6976c7d4-7xx9n\" (UID: \"c666ba21-097e-4fb4-ac00-f607e9a9198f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" Apr 20 16:23:45.270790 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.270742 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-bound-sa-token\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:23:45.270790 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.270779 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2x2z\" (UniqueName: \"kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-kube-api-access-s2x2z\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:23:45.271219 ip-10-0-142-44 
kubenswrapper[2576]: I0420 16:23:45.270827 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c0c73998-a3cf-46ac-88ee-04698be10974-metrics-tls\") pod \"dns-default-t2rgz\" (UID: \"c0c73998-a3cf-46ac-88ee-04698be10974\") " pod="openshift-dns/dns-default-t2rgz" Apr 20 16:23:45.271219 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.270850 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/c666ba21-097e-4fb4-ac00-f607e9a9198f-ca\") pod \"cluster-proxy-proxy-agent-f6976c7d4-7xx9n\" (UID: \"c666ba21-097e-4fb4-ac00-f607e9a9198f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" Apr 20 16:23:45.371972 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.371930 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/c666ba21-097e-4fb4-ac00-f607e9a9198f-ca\") pod \"cluster-proxy-proxy-agent-f6976c7d4-7xx9n\" (UID: \"c666ba21-097e-4fb4-ac00-f607e9a9198f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" Apr 20 16:23:45.372440 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.371993 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/cbda6188-592c-42dc-b064-bb905f0b2e00-klusterlet-config\") pod \"klusterlet-addon-workmgr-768df5c47b-6jpfj\" (UID: \"cbda6188-592c-42dc-b064-bb905f0b2e00\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-768df5c47b-6jpfj" Apr 20 16:23:45.372440 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.372022 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pcswx\" (UniqueName: \"kubernetes.io/projected/607b691a-53a8-4d6c-9a81-238041e2f614-kube-api-access-pcswx\") pod 
\"managed-serviceaccount-addon-agent-74c4768dd5-95lnn\" (UID: \"607b691a-53a8-4d6c-9a81-238041e2f614\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74c4768dd5-95lnn" Apr 20 16:23:45.372440 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.372151 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skhpg\" (UniqueName: \"kubernetes.io/projected/c0c73998-a3cf-46ac-88ee-04698be10974-kube-api-access-skhpg\") pod \"dns-default-t2rgz\" (UID: \"c0c73998-a3cf-46ac-88ee-04698be10974\") " pod="openshift-dns/dns-default-t2rgz" Apr 20 16:23:45.372440 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.372198 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjzrn\" (UniqueName: \"kubernetes.io/projected/c666ba21-097e-4fb4-ac00-f607e9a9198f-kube-api-access-cjzrn\") pod \"cluster-proxy-proxy-agent-f6976c7d4-7xx9n\" (UID: \"c666ba21-097e-4fb4-ac00-f607e9a9198f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" Apr 20 16:23:45.372440 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.372284 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/607b691a-53a8-4d6c-9a81-238041e2f614-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-74c4768dd5-95lnn\" (UID: \"607b691a-53a8-4d6c-9a81-238041e2f614\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74c4768dd5-95lnn" Apr 20 16:23:45.372440 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.372335 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b677bf42-4f9a-44b2-8c42-7b22242cad9b-ca-trust-extracted\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 
16:23:45.372440 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.372383 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c0c73998-a3cf-46ac-88ee-04698be10974-config-volume\") pod \"dns-default-t2rgz\" (UID: \"c0c73998-a3cf-46ac-88ee-04698be10974\") " pod="openshift-dns/dns-default-t2rgz" Apr 20 16:23:45.372440 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.372436 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/c666ba21-097e-4fb4-ac00-f607e9a9198f-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f6976c7d4-7xx9n\" (UID: \"c666ba21-097e-4fb4-ac00-f607e9a9198f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" Apr 20 16:23:45.372937 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.372466 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/c666ba21-097e-4fb4-ac00-f607e9a9198f-hub\") pod \"cluster-proxy-proxy-agent-f6976c7d4-7xx9n\" (UID: \"c666ba21-097e-4fb4-ac00-f607e9a9198f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" Apr 20 16:23:45.372937 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.372489 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/c666ba21-097e-4fb4-ac00-f607e9a9198f-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f6976c7d4-7xx9n\" (UID: \"c666ba21-097e-4fb4-ac00-f607e9a9198f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" Apr 20 16:23:45.372937 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.372514 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2f4bb\" (UniqueName: 
\"kubernetes.io/projected/0dfe4973-64f0-41f7-a34e-6d35be53c155-kube-api-access-2f4bb\") pod \"ingress-canary-7f8db\" (UID: \"0dfe4973-64f0-41f7-a34e-6d35be53c155\") " pod="openshift-ingress-canary/ingress-canary-7f8db" Apr 20 16:23:45.372937 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.372541 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b677bf42-4f9a-44b2-8c42-7b22242cad9b-trusted-ca\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:23:45.372937 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.372563 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dfe4973-64f0-41f7-a34e-6d35be53c155-cert\") pod \"ingress-canary-7f8db\" (UID: \"0dfe4973-64f0-41f7-a34e-6d35be53c155\") " pod="openshift-ingress-canary/ingress-canary-7f8db" Apr 20 16:23:45.372937 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.372611 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:23:45.372937 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.372635 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-certificates\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:23:45.372937 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.372669 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b677bf42-4f9a-44b2-8c42-7b22242cad9b-installation-pull-secrets\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:23:45.372937 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.372704 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tcltf\" (UniqueName: \"kubernetes.io/projected/cbda6188-592c-42dc-b064-bb905f0b2e00-kube-api-access-tcltf\") pod \"klusterlet-addon-workmgr-768df5c47b-6jpfj\" (UID: \"cbda6188-592c-42dc-b064-bb905f0b2e00\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-768df5c47b-6jpfj" Apr 20 16:23:45.372937 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.372729 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b677bf42-4f9a-44b2-8c42-7b22242cad9b-image-registry-private-configuration\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:23:45.372937 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.372782 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cbda6188-592c-42dc-b064-bb905f0b2e00-tmp\") pod \"klusterlet-addon-workmgr-768df5c47b-6jpfj\" (UID: \"cbda6188-592c-42dc-b064-bb905f0b2e00\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-768df5c47b-6jpfj" Apr 20 16:23:45.372937 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.372803 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c0c73998-a3cf-46ac-88ee-04698be10974-tmp-dir\") pod 
\"dns-default-t2rgz\" (UID: \"c0c73998-a3cf-46ac-88ee-04698be10974\") " pod="openshift-dns/dns-default-t2rgz" Apr 20 16:23:45.372937 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.372822 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c666ba21-097e-4fb4-ac00-f607e9a9198f-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f6976c7d4-7xx9n\" (UID: \"c666ba21-097e-4fb4-ac00-f607e9a9198f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" Apr 20 16:23:45.372937 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.372862 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-bound-sa-token\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:23:45.372937 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.372883 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2x2z\" (UniqueName: \"kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-kube-api-access-s2x2z\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:23:45.372937 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.372905 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c0c73998-a3cf-46ac-88ee-04698be10974-metrics-tls\") pod \"dns-default-t2rgz\" (UID: \"c0c73998-a3cf-46ac-88ee-04698be10974\") " pod="openshift-dns/dns-default-t2rgz" Apr 20 16:23:45.373699 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:45.373015 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret 
"dns-default-metrics-tls" not found Apr 20 16:23:45.373699 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.373038 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c0c73998-a3cf-46ac-88ee-04698be10974-config-volume\") pod \"dns-default-t2rgz\" (UID: \"c0c73998-a3cf-46ac-88ee-04698be10974\") " pod="openshift-dns/dns-default-t2rgz" Apr 20 16:23:45.373699 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:45.373094 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0c73998-a3cf-46ac-88ee-04698be10974-metrics-tls podName:c0c73998-a3cf-46ac-88ee-04698be10974 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:45.873076346 +0000 UTC m=+34.099284269 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c0c73998-a3cf-46ac-88ee-04698be10974-metrics-tls") pod "dns-default-t2rgz" (UID: "c0c73998-a3cf-46ac-88ee-04698be10974") : secret "dns-default-metrics-tls" not found Apr 20 16:23:45.373699 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.373366 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b677bf42-4f9a-44b2-8c42-7b22242cad9b-ca-trust-extracted\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:23:45.374446 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.374184 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b677bf42-4f9a-44b2-8c42-7b22242cad9b-trusted-ca\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:23:45.374554 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.374490 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cbda6188-592c-42dc-b064-bb905f0b2e00-tmp\") pod \"klusterlet-addon-workmgr-768df5c47b-6jpfj\" (UID: \"cbda6188-592c-42dc-b064-bb905f0b2e00\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-768df5c47b-6jpfj" Apr 20 16:23:45.374554 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.374516 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/c666ba21-097e-4fb4-ac00-f607e9a9198f-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f6976c7d4-7xx9n\" (UID: \"c666ba21-097e-4fb4-ac00-f607e9a9198f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" Apr 20 16:23:45.375181 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:45.374895 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 16:23:45.375181 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:45.374952 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dfe4973-64f0-41f7-a34e-6d35be53c155-cert podName:0dfe4973-64f0-41f7-a34e-6d35be53c155 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:45.874933576 +0000 UTC m=+34.101141503 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0dfe4973-64f0-41f7-a34e-6d35be53c155-cert") pod "ingress-canary-7f8db" (UID: "0dfe4973-64f0-41f7-a34e-6d35be53c155") : secret "canary-serving-cert" not found Apr 20 16:23:45.375181 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.374950 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c0c73998-a3cf-46ac-88ee-04698be10974-tmp-dir\") pod \"dns-default-t2rgz\" (UID: \"c0c73998-a3cf-46ac-88ee-04698be10974\") " pod="openshift-dns/dns-default-t2rgz" Apr 20 16:23:45.375181 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.375051 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-certificates\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:23:45.375181 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:45.375135 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 16:23:45.375181 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:45.375149 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6556f6497c-5xzdw: secret "image-registry-tls" not found Apr 20 16:23:45.375520 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:45.375209 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls podName:b677bf42-4f9a-44b2-8c42-7b22242cad9b nodeName:}" failed. No retries permitted until 2026-04-20 16:23:45.875190173 +0000 UTC m=+34.101398095 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls") pod "image-registry-6556f6497c-5xzdw" (UID: "b677bf42-4f9a-44b2-8c42-7b22242cad9b") : secret "image-registry-tls" not found Apr 20 16:23:45.377730 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.377709 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b677bf42-4f9a-44b2-8c42-7b22242cad9b-image-registry-private-configuration\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:23:45.377885 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.377713 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/c666ba21-097e-4fb4-ac00-f607e9a9198f-ca\") pod \"cluster-proxy-proxy-agent-f6976c7d4-7xx9n\" (UID: \"c666ba21-097e-4fb4-ac00-f607e9a9198f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" Apr 20 16:23:45.377972 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.377718 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/c666ba21-097e-4fb4-ac00-f607e9a9198f-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f6976c7d4-7xx9n\" (UID: \"c666ba21-097e-4fb4-ac00-f607e9a9198f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" Apr 20 16:23:45.377972 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.377729 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/c666ba21-097e-4fb4-ac00-f607e9a9198f-hub\") pod \"cluster-proxy-proxy-agent-f6976c7d4-7xx9n\" (UID: \"c666ba21-097e-4fb4-ac00-f607e9a9198f\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" Apr 20 16:23:45.378171 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.378143 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b677bf42-4f9a-44b2-8c42-7b22242cad9b-installation-pull-secrets\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:23:45.378311 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.378291 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c666ba21-097e-4fb4-ac00-f607e9a9198f-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f6976c7d4-7xx9n\" (UID: \"c666ba21-097e-4fb4-ac00-f607e9a9198f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" Apr 20 16:23:45.379527 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.379408 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/cbda6188-592c-42dc-b064-bb905f0b2e00-klusterlet-config\") pod \"klusterlet-addon-workmgr-768df5c47b-6jpfj\" (UID: \"cbda6188-592c-42dc-b064-bb905f0b2e00\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-768df5c47b-6jpfj" Apr 20 16:23:45.379719 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.379659 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/607b691a-53a8-4d6c-9a81-238041e2f614-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-74c4768dd5-95lnn\" (UID: \"607b691a-53a8-4d6c-9a81-238041e2f614\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74c4768dd5-95lnn" Apr 20 16:23:45.381792 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.381436 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcswx\" (UniqueName: \"kubernetes.io/projected/607b691a-53a8-4d6c-9a81-238041e2f614-kube-api-access-pcswx\") pod \"managed-serviceaccount-addon-agent-74c4768dd5-95lnn\" (UID: \"607b691a-53a8-4d6c-9a81-238041e2f614\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74c4768dd5-95lnn" Apr 20 16:23:45.381792 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.381447 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f4bb\" (UniqueName: \"kubernetes.io/projected/0dfe4973-64f0-41f7-a34e-6d35be53c155-kube-api-access-2f4bb\") pod \"ingress-canary-7f8db\" (UID: \"0dfe4973-64f0-41f7-a34e-6d35be53c155\") " pod="openshift-ingress-canary/ingress-canary-7f8db" Apr 20 16:23:45.382162 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.382140 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjzrn\" (UniqueName: \"kubernetes.io/projected/c666ba21-097e-4fb4-ac00-f607e9a9198f-kube-api-access-cjzrn\") pod \"cluster-proxy-proxy-agent-f6976c7d4-7xx9n\" (UID: \"c666ba21-097e-4fb4-ac00-f607e9a9198f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" Apr 20 16:23:45.382450 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.382426 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-skhpg\" (UniqueName: \"kubernetes.io/projected/c0c73998-a3cf-46ac-88ee-04698be10974-kube-api-access-skhpg\") pod \"dns-default-t2rgz\" (UID: \"c0c73998-a3cf-46ac-88ee-04698be10974\") " pod="openshift-dns/dns-default-t2rgz" Apr 20 16:23:45.383700 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.383661 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-bound-sa-token\") pod \"image-registry-6556f6497c-5xzdw\" (UID: 
\"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:23:45.384048 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.384027 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2x2z\" (UniqueName: \"kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-kube-api-access-s2x2z\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:23:45.384685 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.384664 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcltf\" (UniqueName: \"kubernetes.io/projected/cbda6188-592c-42dc-b064-bb905f0b2e00-kube-api-access-tcltf\") pod \"klusterlet-addon-workmgr-768df5c47b-6jpfj\" (UID: \"cbda6188-592c-42dc-b064-bb905f0b2e00\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-768df5c47b-6jpfj" Apr 20 16:23:45.496842 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.496739 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-768df5c47b-6jpfj" Apr 20 16:23:45.521249 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.521217 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74c4768dd5-95lnn" Apr 20 16:23:45.529021 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.528805 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" Apr 20 16:23:45.659914 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.659881 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-768df5c47b-6jpfj"] Apr 20 16:23:45.664638 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:45.664604 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbda6188_592c_42dc_b064_bb905f0b2e00.slice/crio-96fcbeeeb470fe05964c4fa94646e48534cd970d03f2af10ef32d57f0ff95fd2 WatchSource:0}: Error finding container 96fcbeeeb470fe05964c4fa94646e48534cd970d03f2af10ef32d57f0ff95fd2: Status 404 returned error can't find the container with id 96fcbeeeb470fe05964c4fa94646e48534cd970d03f2af10ef32d57f0ff95fd2 Apr 20 16:23:45.677217 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.677185 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74c4768dd5-95lnn"] Apr 20 16:23:45.680629 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:45.680598 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod607b691a_53a8_4d6c_9a81_238041e2f614.slice/crio-7b38aa13999b840d9c31cf39775385c8f6cbc4cd8e90ba73492628b279133ae4 WatchSource:0}: Error finding container 7b38aa13999b840d9c31cf39775385c8f6cbc4cd8e90ba73492628b279133ae4: Status 404 returned error can't find the container with id 7b38aa13999b840d9c31cf39775385c8f6cbc4cd8e90ba73492628b279133ae4 Apr 20 16:23:45.699792 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.699747 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n"] Apr 20 16:23:45.703378 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:45.703341 2576 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc666ba21_097e_4fb4_ac00_f607e9a9198f.slice/crio-0ad105c1eb660767c0ac9a9478d1bb1166e34bd9d0189d9309f143cead0c8705 WatchSource:0}: Error finding container 0ad105c1eb660767c0ac9a9478d1bb1166e34bd9d0189d9309f143cead0c8705: Status 404 returned error can't find the container with id 0ad105c1eb660767c0ac9a9478d1bb1166e34bd9d0189d9309f143cead0c8705 Apr 20 16:23:45.891177 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.890131 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dfe4973-64f0-41f7-a34e-6d35be53c155-cert\") pod \"ingress-canary-7f8db\" (UID: \"0dfe4973-64f0-41f7-a34e-6d35be53c155\") " pod="openshift-ingress-canary/ingress-canary-7f8db" Apr 20 16:23:45.891177 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.890192 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:23:45.891177 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.890249 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c0c73998-a3cf-46ac-88ee-04698be10974-metrics-tls\") pod \"dns-default-t2rgz\" (UID: \"c0c73998-a3cf-46ac-88ee-04698be10974\") " pod="openshift-dns/dns-default-t2rgz" Apr 20 16:23:45.891177 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:45.890371 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 16:23:45.891177 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:45.890432 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/c0c73998-a3cf-46ac-88ee-04698be10974-metrics-tls podName:c0c73998-a3cf-46ac-88ee-04698be10974 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:46.890412875 +0000 UTC m=+35.116620815 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c0c73998-a3cf-46ac-88ee-04698be10974-metrics-tls") pod "dns-default-t2rgz" (UID: "c0c73998-a3cf-46ac-88ee-04698be10974") : secret "dns-default-metrics-tls" not found
Apr 20 16:23:45.891177 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:45.890851 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 16:23:45.891177 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:45.890900 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dfe4973-64f0-41f7-a34e-6d35be53c155-cert podName:0dfe4973-64f0-41f7-a34e-6d35be53c155 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:46.890882285 +0000 UTC m=+35.117090205 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0dfe4973-64f0-41f7-a34e-6d35be53c155-cert") pod "ingress-canary-7f8db" (UID: "0dfe4973-64f0-41f7-a34e-6d35be53c155") : secret "canary-serving-cert" not found
Apr 20 16:23:45.891177 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:45.890957 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 16:23:45.891177 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:45.890969 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6556f6497c-5xzdw: secret "image-registry-tls" not found
Apr 20 16:23:45.891177 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:45.890998 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls podName:b677bf42-4f9a-44b2-8c42-7b22242cad9b nodeName:}" failed. No retries permitted until 2026-04-20 16:23:46.890988541 +0000 UTC m=+35.117196475 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls") pod "image-registry-6556f6497c-5xzdw" (UID: "b677bf42-4f9a-44b2-8c42-7b22242cad9b") : secret "image-registry-tls" not found
Apr 20 16:23:45.991397 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:45.991354 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs\") pod \"network-metrics-daemon-tr5xd\" (UID: \"7948c105-68aa-437a-a0ac-fa0d535c7b37\") " pod="openshift-multus/network-metrics-daemon-tr5xd"
Apr 20 16:23:45.991547 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:45.991454 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 16:23:45.991547 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:45.991523 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs podName:7948c105-68aa-437a-a0ac-fa0d535c7b37 nodeName:}" failed. No retries permitted until 2026-04-20 16:24:17.991503587 +0000 UTC m=+66.217711507 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs") pod "network-metrics-daemon-tr5xd" (UID: "7948c105-68aa-437a-a0ac-fa0d535c7b37") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 16:23:46.092050 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:46.092012 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ln67g\" (UniqueName: \"kubernetes.io/projected/3b741c5a-ce22-4075-86e3-0c1155e94215-kube-api-access-ln67g\") pod \"network-check-target-cq4h4\" (UID: \"3b741c5a-ce22-4075-86e3-0c1155e94215\") " pod="openshift-network-diagnostics/network-check-target-cq4h4"
Apr 20 16:23:46.092236 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:46.092182 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 16:23:46.092236 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:46.092208 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 16:23:46.092236 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:46.092220 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ln67g for pod openshift-network-diagnostics/network-check-target-cq4h4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 16:23:46.092371 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:46.092280 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b741c5a-ce22-4075-86e3-0c1155e94215-kube-api-access-ln67g podName:3b741c5a-ce22-4075-86e3-0c1155e94215 nodeName:}" failed. No retries permitted until 2026-04-20 16:24:18.092262168 +0000 UTC m=+66.318470092 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-ln67g" (UniqueName: "kubernetes.io/projected/3b741c5a-ce22-4075-86e3-0c1155e94215-kube-api-access-ln67g") pod "network-check-target-cq4h4" (UID: "3b741c5a-ce22-4075-86e3-0c1155e94215") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 16:23:46.358835 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:46.358649 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cq4h4"
Apr 20 16:23:46.359307 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:46.359188 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28hkj"
Apr 20 16:23:46.359448 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:46.359435 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5xd"
Apr 20 16:23:46.362085 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:46.361785 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hv7wf\""
Apr 20 16:23:46.362085 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:46.361926 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 20 16:23:46.362778 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:46.362687 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 16:23:46.362778 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:46.362727 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 16:23:46.362918 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:46.362691 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 16:23:46.363253 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:46.363233 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xjxmn\""
Apr 20 16:23:46.507563 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:46.507523 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" event={"ID":"c666ba21-097e-4fb4-ac00-f607e9a9198f","Type":"ContainerStarted","Data":"0ad105c1eb660767c0ac9a9478d1bb1166e34bd9d0189d9309f143cead0c8705"}
Apr 20 16:23:46.509269 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:46.509234 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74c4768dd5-95lnn" event={"ID":"607b691a-53a8-4d6c-9a81-238041e2f614","Type":"ContainerStarted","Data":"7b38aa13999b840d9c31cf39775385c8f6cbc4cd8e90ba73492628b279133ae4"}
Apr 20 16:23:46.511694 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:46.511667 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-768df5c47b-6jpfj" event={"ID":"cbda6188-592c-42dc-b064-bb905f0b2e00","Type":"ContainerStarted","Data":"96fcbeeeb470fe05964c4fa94646e48534cd970d03f2af10ef32d57f0ff95fd2"}
Apr 20 16:23:46.899243 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:46.899202 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c0c73998-a3cf-46ac-88ee-04698be10974-metrics-tls\") pod \"dns-default-t2rgz\" (UID: \"c0c73998-a3cf-46ac-88ee-04698be10974\") " pod="openshift-dns/dns-default-t2rgz"
Apr 20 16:23:46.899442 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:46.899320 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dfe4973-64f0-41f7-a34e-6d35be53c155-cert\") pod \"ingress-canary-7f8db\" (UID: \"0dfe4973-64f0-41f7-a34e-6d35be53c155\") " pod="openshift-ingress-canary/ingress-canary-7f8db"
Apr 20 16:23:46.899442 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:46.899357 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 16:23:46.899442 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:46.899376 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw"
Apr 20 16:23:46.899442 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:46.899434 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0c73998-a3cf-46ac-88ee-04698be10974-metrics-tls podName:c0c73998-a3cf-46ac-88ee-04698be10974 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:48.89941479 +0000 UTC m=+37.125622711 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c0c73998-a3cf-46ac-88ee-04698be10974-metrics-tls") pod "dns-default-t2rgz" (UID: "c0c73998-a3cf-46ac-88ee-04698be10974") : secret "dns-default-metrics-tls" not found
Apr 20 16:23:46.899670 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:46.899497 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 16:23:46.899670 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:46.899510 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6556f6497c-5xzdw: secret "image-registry-tls" not found
Apr 20 16:23:46.899670 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:46.899554 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls podName:b677bf42-4f9a-44b2-8c42-7b22242cad9b nodeName:}" failed. No retries permitted until 2026-04-20 16:23:48.899542118 +0000 UTC m=+37.125750052 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls") pod "image-registry-6556f6497c-5xzdw" (UID: "b677bf42-4f9a-44b2-8c42-7b22242cad9b") : secret "image-registry-tls" not found
Apr 20 16:23:46.899670 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:46.899604 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 16:23:46.899670 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:46.899630 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dfe4973-64f0-41f7-a34e-6d35be53c155-cert podName:0dfe4973-64f0-41f7-a34e-6d35be53c155 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:48.899621686 +0000 UTC m=+37.125829606 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0dfe4973-64f0-41f7-a34e-6d35be53c155-cert") pod "ingress-canary-7f8db" (UID: "0dfe4973-64f0-41f7-a34e-6d35be53c155") : secret "canary-serving-cert" not found
Apr 20 16:23:48.013377 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:48.013333 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/09560999-6ebd-4da5-b805-d700919dfb04-original-pull-secret\") pod \"global-pull-secret-syncer-28hkj\" (UID: \"09560999-6ebd-4da5-b805-d700919dfb04\") " pod="kube-system/global-pull-secret-syncer-28hkj"
Apr 20 16:23:48.020988 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:48.020954 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/09560999-6ebd-4da5-b805-d700919dfb04-original-pull-secret\") pod \"global-pull-secret-syncer-28hkj\" (UID: \"09560999-6ebd-4da5-b805-d700919dfb04\") " pod="kube-system/global-pull-secret-syncer-28hkj"
Apr 20 16:23:48.201148 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:48.201111 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28hkj"
Apr 20 16:23:48.922689 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:48.922647 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dfe4973-64f0-41f7-a34e-6d35be53c155-cert\") pod \"ingress-canary-7f8db\" (UID: \"0dfe4973-64f0-41f7-a34e-6d35be53c155\") " pod="openshift-ingress-canary/ingress-canary-7f8db"
Apr 20 16:23:48.922887 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:48.922714 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw"
Apr 20 16:23:48.922887 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:48.922794 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c0c73998-a3cf-46ac-88ee-04698be10974-metrics-tls\") pod \"dns-default-t2rgz\" (UID: \"c0c73998-a3cf-46ac-88ee-04698be10974\") " pod="openshift-dns/dns-default-t2rgz"
Apr 20 16:23:48.922887 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:48.922830 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 16:23:48.922887 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:48.922842 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 16:23:48.922887 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:48.922860 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6556f6497c-5xzdw: secret "image-registry-tls" not found
Apr 20 16:23:48.923310 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:48.922903 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dfe4973-64f0-41f7-a34e-6d35be53c155-cert podName:0dfe4973-64f0-41f7-a34e-6d35be53c155 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:52.92288241 +0000 UTC m=+41.149090337 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0dfe4973-64f0-41f7-a34e-6d35be53c155-cert") pod "ingress-canary-7f8db" (UID: "0dfe4973-64f0-41f7-a34e-6d35be53c155") : secret "canary-serving-cert" not found
Apr 20 16:23:48.923310 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:48.922923 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls podName:b677bf42-4f9a-44b2-8c42-7b22242cad9b nodeName:}" failed. No retries permitted until 2026-04-20 16:23:52.922911677 +0000 UTC m=+41.149119601 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls") pod "image-registry-6556f6497c-5xzdw" (UID: "b677bf42-4f9a-44b2-8c42-7b22242cad9b") : secret "image-registry-tls" not found
Apr 20 16:23:48.923310 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:48.922927 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 16:23:48.923310 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:48.922981 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0c73998-a3cf-46ac-88ee-04698be10974-metrics-tls podName:c0c73998-a3cf-46ac-88ee-04698be10974 nodeName:}" failed. No retries permitted until 2026-04-20 16:23:52.922963485 +0000 UTC m=+41.149171418 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c0c73998-a3cf-46ac-88ee-04698be10974-metrics-tls") pod "dns-default-t2rgz" (UID: "c0c73998-a3cf-46ac-88ee-04698be10974") : secret "dns-default-metrics-tls" not found
Apr 20 16:23:52.958200 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:52.958144 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dfe4973-64f0-41f7-a34e-6d35be53c155-cert\") pod \"ingress-canary-7f8db\" (UID: \"0dfe4973-64f0-41f7-a34e-6d35be53c155\") " pod="openshift-ingress-canary/ingress-canary-7f8db"
Apr 20 16:23:52.958717 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:52.958220 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw"
Apr 20 16:23:52.958717 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:52.958272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c0c73998-a3cf-46ac-88ee-04698be10974-metrics-tls\") pod \"dns-default-t2rgz\" (UID: \"c0c73998-a3cf-46ac-88ee-04698be10974\") " pod="openshift-dns/dns-default-t2rgz"
Apr 20 16:23:52.958717 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:52.958288 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 16:23:52.958717 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:52.958355 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dfe4973-64f0-41f7-a34e-6d35be53c155-cert podName:0dfe4973-64f0-41f7-a34e-6d35be53c155 nodeName:}" failed. No retries permitted until 2026-04-20 16:24:00.958339783 +0000 UTC m=+49.184547703 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0dfe4973-64f0-41f7-a34e-6d35be53c155-cert") pod "ingress-canary-7f8db" (UID: "0dfe4973-64f0-41f7-a34e-6d35be53c155") : secret "canary-serving-cert" not found
Apr 20 16:23:52.958717 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:52.958367 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 16:23:52.958717 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:52.958384 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6556f6497c-5xzdw: secret "image-registry-tls" not found
Apr 20 16:23:52.958717 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:52.958392 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 16:23:52.958717 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:52.958430 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls podName:b677bf42-4f9a-44b2-8c42-7b22242cad9b nodeName:}" failed. No retries permitted until 2026-04-20 16:24:00.958411548 +0000 UTC m=+49.184619467 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls") pod "image-registry-6556f6497c-5xzdw" (UID: "b677bf42-4f9a-44b2-8c42-7b22242cad9b") : secret "image-registry-tls" not found
Apr 20 16:23:52.958717 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:23:52.958449 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0c73998-a3cf-46ac-88ee-04698be10974-metrics-tls podName:c0c73998-a3cf-46ac-88ee-04698be10974 nodeName:}" failed. No retries permitted until 2026-04-20 16:24:00.958439978 +0000 UTC m=+49.184647904 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c0c73998-a3cf-46ac-88ee-04698be10974-metrics-tls") pod "dns-default-t2rgz" (UID: "c0c73998-a3cf-46ac-88ee-04698be10974") : secret "dns-default-metrics-tls" not found
Apr 20 16:23:54.015444 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:54.015310 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-28hkj"]
Apr 20 16:23:54.020252 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:23:54.020219 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09560999_6ebd_4da5_b805_d700919dfb04.slice/crio-8166213941a40ee166189bf36cd5024efe920e5a73648cdd601b2ce6b73f8aca WatchSource:0}: Error finding container 8166213941a40ee166189bf36cd5024efe920e5a73648cdd601b2ce6b73f8aca: Status 404 returned error can't find the container with id 8166213941a40ee166189bf36cd5024efe920e5a73648cdd601b2ce6b73f8aca
Apr 20 16:23:54.531562 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:54.531524 2576 generic.go:358] "Generic (PLEG): container finished" podID="1bb216ed-aa87-4017-b000-0f3d37d1fda9" containerID="83494a7a86eecb99995dbd6f40c3048a6410f93180da1c8475c9bea036c39ddc" exitCode=0
Apr 20 16:23:54.531777 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:54.531598 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wsgnn" event={"ID":"1bb216ed-aa87-4017-b000-0f3d37d1fda9","Type":"ContainerDied","Data":"83494a7a86eecb99995dbd6f40c3048a6410f93180da1c8475c9bea036c39ddc"}
Apr 20 16:23:54.532867 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:54.532844 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" event={"ID":"c666ba21-097e-4fb4-ac00-f607e9a9198f","Type":"ContainerStarted","Data":"0350728e7c3d7055a2b9a246bd51401e0fd0d06169c71229b747c5bb3b1cfb4f"}
Apr 20 16:23:54.534220 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:54.534195 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74c4768dd5-95lnn" event={"ID":"607b691a-53a8-4d6c-9a81-238041e2f614","Type":"ContainerStarted","Data":"338c5ba47b1f1fbbc4214f06d474391fb1d62353e0709956e8bd280ed1e10405"}
Apr 20 16:23:54.535504 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:54.535484 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-768df5c47b-6jpfj" event={"ID":"cbda6188-592c-42dc-b064-bb905f0b2e00","Type":"ContainerStarted","Data":"941bd8a67e426227b1b06393c004db51e7a9bacc0f9bf9541e1b07c6d18cdd9d"}
Apr 20 16:23:54.535689 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:54.535675 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-768df5c47b-6jpfj"
Apr 20 16:23:54.536565 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:54.536546 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-28hkj" event={"ID":"09560999-6ebd-4da5-b805-d700919dfb04","Type":"ContainerStarted","Data":"8166213941a40ee166189bf36cd5024efe920e5a73648cdd601b2ce6b73f8aca"}
Apr 20 16:23:54.537562 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:54.537541 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-768df5c47b-6jpfj"
Apr 20 16:23:54.570662 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:54.568286 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-768df5c47b-6jpfj" podStartSLOduration=8.327830827 podStartE2EDuration="16.568268278s" podCreationTimestamp="2026-04-20 16:23:38 +0000 UTC" firstStartedPulling="2026-04-20 16:23:45.66680602 +0000 UTC m=+33.893013939" lastFinishedPulling="2026-04-20 16:23:53.907243471 +0000 UTC m=+42.133451390" observedRunningTime="2026-04-20 16:23:54.566522676 +0000 UTC m=+42.792730618" watchObservedRunningTime="2026-04-20 16:23:54.568268278 +0000 UTC m=+42.794476221"
Apr 20 16:23:54.584932 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:54.584875 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74c4768dd5-95lnn" podStartSLOduration=8.376841145 podStartE2EDuration="16.584857853s" podCreationTimestamp="2026-04-20 16:23:38 +0000 UTC" firstStartedPulling="2026-04-20 16:23:45.683083997 +0000 UTC m=+33.909291930" lastFinishedPulling="2026-04-20 16:23:53.891100715 +0000 UTC m=+42.117308638" observedRunningTime="2026-04-20 16:23:54.584093172 +0000 UTC m=+42.810301115" watchObservedRunningTime="2026-04-20 16:23:54.584857853 +0000 UTC m=+42.811065795"
Apr 20 16:23:55.542067 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:55.542028 2576 generic.go:358] "Generic (PLEG): container finished" podID="1bb216ed-aa87-4017-b000-0f3d37d1fda9" containerID="2d143ef5911d78a4dee8bb13d5efaa76c34ca442b87d24e6d1448d81cdf3c532" exitCode=0
Apr 20 16:23:55.542530 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:55.542126 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wsgnn" event={"ID":"1bb216ed-aa87-4017-b000-0f3d37d1fda9","Type":"ContainerDied","Data":"2d143ef5911d78a4dee8bb13d5efaa76c34ca442b87d24e6d1448d81cdf3c532"}
Apr 20 16:23:56.546770 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:56.546718 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wsgnn" event={"ID":"1bb216ed-aa87-4017-b000-0f3d37d1fda9","Type":"ContainerStarted","Data":"115556b6e92fd3ed5f42a0abf7e325cba057f280631ad69300b9aa7689d447e3"}
Apr 20 16:23:56.573885 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:56.573835 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wsgnn" podStartSLOduration=5.867069201 podStartE2EDuration="44.573819207s" podCreationTimestamp="2026-04-20 16:23:12 +0000 UTC" firstStartedPulling="2026-04-20 16:23:15.183204806 +0000 UTC m=+3.409412725" lastFinishedPulling="2026-04-20 16:23:53.889954811 +0000 UTC m=+42.116162731" observedRunningTime="2026-04-20 16:23:56.573121373 +0000 UTC m=+44.799329313" watchObservedRunningTime="2026-04-20 16:23:56.573819207 +0000 UTC m=+44.800027149"
Apr 20 16:23:57.550965 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:57.550925 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" event={"ID":"c666ba21-097e-4fb4-ac00-f607e9a9198f","Type":"ContainerStarted","Data":"c610a907bf2a3a24fafa074c1c5397629a55fdafe3db1da89a97af0483423008"}
Apr 20 16:23:58.554965 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:58.554921 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" event={"ID":"c666ba21-097e-4fb4-ac00-f607e9a9198f","Type":"ContainerStarted","Data":"a8621c1faee015c365989cee93e2e5c9f54cc868b86cd127c66f6cda818c9838"}
Apr 20 16:23:58.573083 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:58.573031 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" podStartSLOduration=9.46934007 podStartE2EDuration="20.573017913s" podCreationTimestamp="2026-04-20 16:23:38 +0000 UTC" firstStartedPulling="2026-04-20 16:23:45.705565662 +0000 UTC m=+33.931773582" lastFinishedPulling="2026-04-20 16:23:56.809243505 +0000 UTC m=+45.035451425" observedRunningTime="2026-04-20 16:23:58.571094553 +0000 UTC m=+46.797302494" watchObservedRunningTime="2026-04-20 16:23:58.573017913 +0000 UTC m=+46.799225855"
Apr 20 16:23:59.560141 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:59.560102 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-28hkj" event={"ID":"09560999-6ebd-4da5-b805-d700919dfb04","Type":"ContainerStarted","Data":"2c6da81bf5d850b42041a3d61f68c865e411881407daf695b3ac216d790203a2"}
Apr 20 16:23:59.575482 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:23:59.575431 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-28hkj" podStartSLOduration=38.947352152 podStartE2EDuration="43.575414486s" podCreationTimestamp="2026-04-20 16:23:16 +0000 UTC" firstStartedPulling="2026-04-20 16:23:54.022861646 +0000 UTC m=+42.249069569" lastFinishedPulling="2026-04-20 16:23:58.650923973 +0000 UTC m=+46.877131903" observedRunningTime="2026-04-20 16:23:59.574916737 +0000 UTC m=+47.801124679" watchObservedRunningTime="2026-04-20 16:23:59.575414486 +0000 UTC m=+47.801622428"
Apr 20 16:24:01.021927 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:24:01.021889 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dfe4973-64f0-41f7-a34e-6d35be53c155-cert\") pod \"ingress-canary-7f8db\" (UID: \"0dfe4973-64f0-41f7-a34e-6d35be53c155\") " pod="openshift-ingress-canary/ingress-canary-7f8db"
Apr 20 16:24:01.021927 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:24:01.021933 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw"
Apr 20 16:24:01.022337 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:24:01.021966 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c0c73998-a3cf-46ac-88ee-04698be10974-metrics-tls\") pod \"dns-default-t2rgz\" (UID: \"c0c73998-a3cf-46ac-88ee-04698be10974\") " pod="openshift-dns/dns-default-t2rgz"
Apr 20 16:24:01.022337 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:24:01.022041 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 16:24:01.022337 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:24:01.022047 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 16:24:01.022337 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:24:01.022076 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6556f6497c-5xzdw: secret "image-registry-tls" not found
Apr 20 16:24:01.022337 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:24:01.022094 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0c73998-a3cf-46ac-88ee-04698be10974-metrics-tls podName:c0c73998-a3cf-46ac-88ee-04698be10974 nodeName:}" failed. No retries permitted until 2026-04-20 16:24:17.0220739 +0000 UTC m=+65.248281821 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c0c73998-a3cf-46ac-88ee-04698be10974-metrics-tls") pod "dns-default-t2rgz" (UID: "c0c73998-a3cf-46ac-88ee-04698be10974") : secret "dns-default-metrics-tls" not found
Apr 20 16:24:01.022337 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:24:01.022123 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls podName:b677bf42-4f9a-44b2-8c42-7b22242cad9b nodeName:}" failed. No retries permitted until 2026-04-20 16:24:17.0221085 +0000 UTC m=+65.248316419 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls") pod "image-registry-6556f6497c-5xzdw" (UID: "b677bf42-4f9a-44b2-8c42-7b22242cad9b") : secret "image-registry-tls" not found
Apr 20 16:24:01.022337 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:24:01.022047 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 16:24:01.022337 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:24:01.022150 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dfe4973-64f0-41f7-a34e-6d35be53c155-cert podName:0dfe4973-64f0-41f7-a34e-6d35be53c155 nodeName:}" failed. No retries permitted until 2026-04-20 16:24:17.022144289 +0000 UTC m=+65.248352209 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0dfe4973-64f0-41f7-a34e-6d35be53c155-cert") pod "ingress-canary-7f8db" (UID: "0dfe4973-64f0-41f7-a34e-6d35be53c155") : secret "canary-serving-cert" not found
Apr 20 16:24:12.506672 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:24:12.506640 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s95ld"
Apr 20 16:24:17.038460 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:24:17.038419 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dfe4973-64f0-41f7-a34e-6d35be53c155-cert\") pod \"ingress-canary-7f8db\" (UID: \"0dfe4973-64f0-41f7-a34e-6d35be53c155\") " pod="openshift-ingress-canary/ingress-canary-7f8db"
Apr 20 16:24:17.038875 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:24:17.038467 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw"
Apr 20 16:24:17.038875 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:24:17.038504 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c0c73998-a3cf-46ac-88ee-04698be10974-metrics-tls\") pod \"dns-default-t2rgz\" (UID: \"c0c73998-a3cf-46ac-88ee-04698be10974\") " pod="openshift-dns/dns-default-t2rgz"
Apr 20 16:24:17.038875 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:24:17.038579 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 16:24:17.038875 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:24:17.038634 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 16:24:17.038875 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:24:17.038650 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6556f6497c-5xzdw: secret "image-registry-tls" not found
Apr 20 16:24:17.038875 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:24:17.038661 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dfe4973-64f0-41f7-a34e-6d35be53c155-cert podName:0dfe4973-64f0-41f7-a34e-6d35be53c155 nodeName:}" failed. No retries permitted until 2026-04-20 16:24:49.038639812 +0000 UTC m=+97.264847753 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0dfe4973-64f0-41f7-a34e-6d35be53c155-cert") pod "ingress-canary-7f8db" (UID: "0dfe4973-64f0-41f7-a34e-6d35be53c155") : secret "canary-serving-cert" not found
Apr 20 16:24:17.038875 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:24:17.038584 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 16:24:17.038875 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:24:17.038700 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls podName:b677bf42-4f9a-44b2-8c42-7b22242cad9b nodeName:}" failed. No retries permitted until 2026-04-20 16:24:49.03868781 +0000 UTC m=+97.264895735 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls") pod "image-registry-6556f6497c-5xzdw" (UID: "b677bf42-4f9a-44b2-8c42-7b22242cad9b") : secret "image-registry-tls" not found Apr 20 16:24:17.038875 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:24:17.038735 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0c73998-a3cf-46ac-88ee-04698be10974-metrics-tls podName:c0c73998-a3cf-46ac-88ee-04698be10974 nodeName:}" failed. No retries permitted until 2026-04-20 16:24:49.038710278 +0000 UTC m=+97.264918198 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c0c73998-a3cf-46ac-88ee-04698be10974-metrics-tls") pod "dns-default-t2rgz" (UID: "c0c73998-a3cf-46ac-88ee-04698be10974") : secret "dns-default-metrics-tls" not found Apr 20 16:24:18.046508 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:24:18.046448 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs\") pod \"network-metrics-daemon-tr5xd\" (UID: \"7948c105-68aa-437a-a0ac-fa0d535c7b37\") " pod="openshift-multus/network-metrics-daemon-tr5xd" Apr 20 16:24:18.049269 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:24:18.049248 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 16:24:18.057426 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:24:18.057402 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 16:24:18.057484 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:24:18.057472 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs podName:7948c105-68aa-437a-a0ac-fa0d535c7b37 
nodeName:}" failed. No retries permitted until 2026-04-20 16:25:22.057454537 +0000 UTC m=+130.283662457 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs") pod "network-metrics-daemon-tr5xd" (UID: "7948c105-68aa-437a-a0ac-fa0d535c7b37") : secret "metrics-daemon-secret" not found Apr 20 16:24:18.147769 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:24:18.147722 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ln67g\" (UniqueName: \"kubernetes.io/projected/3b741c5a-ce22-4075-86e3-0c1155e94215-kube-api-access-ln67g\") pod \"network-check-target-cq4h4\" (UID: \"3b741c5a-ce22-4075-86e3-0c1155e94215\") " pod="openshift-network-diagnostics/network-check-target-cq4h4" Apr 20 16:24:18.150840 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:24:18.150820 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 16:24:18.161201 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:24:18.161175 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 16:24:18.172183 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:24:18.172150 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln67g\" (UniqueName: \"kubernetes.io/projected/3b741c5a-ce22-4075-86e3-0c1155e94215-kube-api-access-ln67g\") pod \"network-check-target-cq4h4\" (UID: \"3b741c5a-ce22-4075-86e3-0c1155e94215\") " pod="openshift-network-diagnostics/network-check-target-cq4h4" Apr 20 16:24:18.176276 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:24:18.176258 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hv7wf\"" Apr 20 16:24:18.184314 ip-10-0-142-44 kubenswrapper[2576]: I0420 
16:24:18.184296 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cq4h4" Apr 20 16:24:18.300349 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:24:18.300273 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cq4h4"] Apr 20 16:24:18.303011 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:24:18.302979 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b741c5a_ce22_4075_86e3_0c1155e94215.slice/crio-9d291606b1bcae2e9d4fa8bd0724264a01064072ca7415c46c0f638c8e91361f WatchSource:0}: Error finding container 9d291606b1bcae2e9d4fa8bd0724264a01064072ca7415c46c0f638c8e91361f: Status 404 returned error can't find the container with id 9d291606b1bcae2e9d4fa8bd0724264a01064072ca7415c46c0f638c8e91361f Apr 20 16:24:18.609959 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:24:18.609869 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cq4h4" event={"ID":"3b741c5a-ce22-4075-86e3-0c1155e94215","Type":"ContainerStarted","Data":"9d291606b1bcae2e9d4fa8bd0724264a01064072ca7415c46c0f638c8e91361f"} Apr 20 16:24:22.623582 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:24:22.623542 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cq4h4" event={"ID":"3b741c5a-ce22-4075-86e3-0c1155e94215","Type":"ContainerStarted","Data":"97a4565abbb81f2a2b1b0bd5728a297d676a0f9ceb57f7caf9fcb725ae080893"} Apr 20 16:24:22.624011 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:24:22.623754 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-cq4h4" Apr 20 16:24:22.639456 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:24:22.639404 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-diagnostics/network-check-target-cq4h4" podStartSLOduration=67.378816322 podStartE2EDuration="1m10.639389823s" podCreationTimestamp="2026-04-20 16:23:12 +0000 UTC" firstStartedPulling="2026-04-20 16:24:18.304995188 +0000 UTC m=+66.531203114" lastFinishedPulling="2026-04-20 16:24:21.565568695 +0000 UTC m=+69.791776615" observedRunningTime="2026-04-20 16:24:22.637926409 +0000 UTC m=+70.864134362" watchObservedRunningTime="2026-04-20 16:24:22.639389823 +0000 UTC m=+70.865597764" Apr 20 16:24:49.074383 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:24:49.074243 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:24:49.074383 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:24:49.074304 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c0c73998-a3cf-46ac-88ee-04698be10974-metrics-tls\") pod \"dns-default-t2rgz\" (UID: \"c0c73998-a3cf-46ac-88ee-04698be10974\") " pod="openshift-dns/dns-default-t2rgz" Apr 20 16:24:49.074383 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:24:49.074367 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dfe4973-64f0-41f7-a34e-6d35be53c155-cert\") pod \"ingress-canary-7f8db\" (UID: \"0dfe4973-64f0-41f7-a34e-6d35be53c155\") " pod="openshift-ingress-canary/ingress-canary-7f8db" Apr 20 16:24:49.074979 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:24:49.074390 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 16:24:49.074979 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:24:49.074416 2576 
projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6556f6497c-5xzdw: secret "image-registry-tls" not found Apr 20 16:24:49.074979 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:24:49.074460 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 16:24:49.074979 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:24:49.074460 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 16:24:49.074979 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:24:49.074473 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls podName:b677bf42-4f9a-44b2-8c42-7b22242cad9b nodeName:}" failed. No retries permitted until 2026-04-20 16:25:53.074456071 +0000 UTC m=+161.300663995 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls") pod "image-registry-6556f6497c-5xzdw" (UID: "b677bf42-4f9a-44b2-8c42-7b22242cad9b") : secret "image-registry-tls" not found Apr 20 16:24:49.074979 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:24:49.074529 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dfe4973-64f0-41f7-a34e-6d35be53c155-cert podName:0dfe4973-64f0-41f7-a34e-6d35be53c155 nodeName:}" failed. No retries permitted until 2026-04-20 16:25:53.074513103 +0000 UTC m=+161.300721028 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0dfe4973-64f0-41f7-a34e-6d35be53c155-cert") pod "ingress-canary-7f8db" (UID: "0dfe4973-64f0-41f7-a34e-6d35be53c155") : secret "canary-serving-cert" not found Apr 20 16:24:49.074979 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:24:49.074547 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0c73998-a3cf-46ac-88ee-04698be10974-metrics-tls podName:c0c73998-a3cf-46ac-88ee-04698be10974 nodeName:}" failed. No retries permitted until 2026-04-20 16:25:53.074539888 +0000 UTC m=+161.300747807 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c0c73998-a3cf-46ac-88ee-04698be10974-metrics-tls") pod "dns-default-t2rgz" (UID: "c0c73998-a3cf-46ac-88ee-04698be10974") : secret "dns-default-metrics-tls" not found Apr 20 16:24:53.629152 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:24:53.629122 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-cq4h4" Apr 20 16:25:22.113493 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:22.113452 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs\") pod \"network-metrics-daemon-tr5xd\" (UID: \"7948c105-68aa-437a-a0ac-fa0d535c7b37\") " pod="openshift-multus/network-metrics-daemon-tr5xd" Apr 20 16:25:22.113999 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:25:22.113615 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 16:25:22.113999 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:25:22.113686 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs podName:7948c105-68aa-437a-a0ac-fa0d535c7b37 nodeName:}" 
failed. No retries permitted until 2026-04-20 16:27:24.11366866 +0000 UTC m=+252.339876580 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs") pod "network-metrics-daemon-tr5xd" (UID: "7948c105-68aa-437a-a0ac-fa0d535c7b37") : secret "metrics-daemon-secret" not found Apr 20 16:25:40.826156 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:40.826126 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-tnl7m_0f424042-eb12-467e-85c1-cbdd302c3e4d/dns-node-resolver/0.log" Apr 20 16:25:41.826159 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:41.826127 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wn2xt_bc10ffb1-dd19-4a22-a3ed-7437a80f1ba7/node-ca/0.log" Apr 20 16:25:48.205989 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:25:48.205923 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" podUID="b677bf42-4f9a-44b2-8c42-7b22242cad9b" Apr 20 16:25:48.252279 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:25:48.252247 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-t2rgz" podUID="c0c73998-a3cf-46ac-88ee-04698be10974" Apr 20 16:25:48.258410 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:25:48.258376 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-7f8db" podUID="0dfe4973-64f0-41f7-a34e-6d35be53c155" Apr 20 16:25:48.821260 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:48.821230 
2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7f8db" Apr 20 16:25:48.821444 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:48.821264 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:25:48.821444 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:48.821377 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-t2rgz" Apr 20 16:25:49.408698 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:25:49.408639 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-tr5xd" podUID="7948c105-68aa-437a-a0ac-fa0d535c7b37" Apr 20 16:25:53.141093 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:53.141052 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dfe4973-64f0-41f7-a34e-6d35be53c155-cert\") pod \"ingress-canary-7f8db\" (UID: \"0dfe4973-64f0-41f7-a34e-6d35be53c155\") " pod="openshift-ingress-canary/ingress-canary-7f8db" Apr 20 16:25:53.141093 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:53.141105 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:25:53.141528 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:53.141136 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c0c73998-a3cf-46ac-88ee-04698be10974-metrics-tls\") pod \"dns-default-t2rgz\" 
(UID: \"c0c73998-a3cf-46ac-88ee-04698be10974\") " pod="openshift-dns/dns-default-t2rgz" Apr 20 16:25:53.143561 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:53.143531 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dfe4973-64f0-41f7-a34e-6d35be53c155-cert\") pod \"ingress-canary-7f8db\" (UID: \"0dfe4973-64f0-41f7-a34e-6d35be53c155\") " pod="openshift-ingress-canary/ingress-canary-7f8db" Apr 20 16:25:53.143680 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:53.143564 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls\") pod \"image-registry-6556f6497c-5xzdw\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") " pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:25:53.143719 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:53.143677 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c0c73998-a3cf-46ac-88ee-04698be10974-metrics-tls\") pod \"dns-default-t2rgz\" (UID: \"c0c73998-a3cf-46ac-88ee-04698be10974\") " pod="openshift-dns/dns-default-t2rgz" Apr 20 16:25:53.325386 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:53.325358 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-n225r\"" Apr 20 16:25:53.325565 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:53.325359 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rqfxd\"" Apr 20 16:25:53.325565 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:53.325428 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cmjpg\"" Apr 20 16:25:53.332523 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:53.332503 2576 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-t2rgz" Apr 20 16:25:53.332580 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:53.332532 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:25:53.332580 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:53.332546 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7f8db" Apr 20 16:25:53.471190 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:53.471106 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7f8db"] Apr 20 16:25:53.473852 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:25:53.473824 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dfe4973_64f0_41f7_a34e_6d35be53c155.slice/crio-13dc9baa159dc76c2a911836d3b7476df0a922ade1223367b329063ac3a4d661 WatchSource:0}: Error finding container 13dc9baa159dc76c2a911836d3b7476df0a922ade1223367b329063ac3a4d661: Status 404 returned error can't find the container with id 13dc9baa159dc76c2a911836d3b7476df0a922ade1223367b329063ac3a4d661 Apr 20 16:25:53.699062 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:53.698976 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-t2rgz"] Apr 20 16:25:53.701329 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:53.701293 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6556f6497c-5xzdw"] Apr 20 16:25:53.702091 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:25:53.702060 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0c73998_a3cf_46ac_88ee_04698be10974.slice/crio-d3570f8191e89d3f95d43a5649da3cdcf8739c5b2efed331c1cb12f8b25f90d4 WatchSource:0}: Error finding 
container d3570f8191e89d3f95d43a5649da3cdcf8739c5b2efed331c1cb12f8b25f90d4: Status 404 returned error can't find the container with id d3570f8191e89d3f95d43a5649da3cdcf8739c5b2efed331c1cb12f8b25f90d4 Apr 20 16:25:53.704173 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:25:53.704151 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb677bf42_4f9a_44b2_8c42_7b22242cad9b.slice/crio-3b139a8fc104ee62e890420690a2ad84d22463435ec651d4cce2a55ea0b4dfd0 WatchSource:0}: Error finding container 3b139a8fc104ee62e890420690a2ad84d22463435ec651d4cce2a55ea0b4dfd0: Status 404 returned error can't find the container with id 3b139a8fc104ee62e890420690a2ad84d22463435ec651d4cce2a55ea0b4dfd0 Apr 20 16:25:53.833572 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:53.833513 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7f8db" event={"ID":"0dfe4973-64f0-41f7-a34e-6d35be53c155","Type":"ContainerStarted","Data":"13dc9baa159dc76c2a911836d3b7476df0a922ade1223367b329063ac3a4d661"} Apr 20 16:25:53.834484 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:53.834457 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t2rgz" event={"ID":"c0c73998-a3cf-46ac-88ee-04698be10974","Type":"ContainerStarted","Data":"d3570f8191e89d3f95d43a5649da3cdcf8739c5b2efed331c1cb12f8b25f90d4"} Apr 20 16:25:53.835652 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:53.835628 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" event={"ID":"b677bf42-4f9a-44b2-8c42-7b22242cad9b","Type":"ContainerStarted","Data":"14213399b4fbf42294d54bfaaef29abc4e09be4b98ca4bcf6eff54fd9d38f585"} Apr 20 16:25:53.835750 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:53.835655 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" 
event={"ID":"b677bf42-4f9a-44b2-8c42-7b22242cad9b","Type":"ContainerStarted","Data":"3b139a8fc104ee62e890420690a2ad84d22463435ec651d4cce2a55ea0b4dfd0"} Apr 20 16:25:53.835820 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:53.835773 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:25:53.854085 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:53.854042 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" podStartSLOduration=161.854026531 podStartE2EDuration="2m41.854026531s" podCreationTimestamp="2026-04-20 16:23:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 16:25:53.853368968 +0000 UTC m=+162.079576908" watchObservedRunningTime="2026-04-20 16:25:53.854026531 +0000 UTC m=+162.080234469" Apr 20 16:25:54.536587 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:54.536469 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-768df5c47b-6jpfj" podUID="cbda6188-592c-42dc-b064-bb905f0b2e00" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.6:8000/readyz\": dial tcp 10.132.0.6:8000: connect: connection refused" Apr 20 16:25:54.840449 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:54.840362 2576 generic.go:358] "Generic (PLEG): container finished" podID="607b691a-53a8-4d6c-9a81-238041e2f614" containerID="338c5ba47b1f1fbbc4214f06d474391fb1d62353e0709956e8bd280ed1e10405" exitCode=255 Apr 20 16:25:54.840611 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:54.840445 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74c4768dd5-95lnn" 
event={"ID":"607b691a-53a8-4d6c-9a81-238041e2f614","Type":"ContainerDied","Data":"338c5ba47b1f1fbbc4214f06d474391fb1d62353e0709956e8bd280ed1e10405"} Apr 20 16:25:54.840822 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:54.840804 2576 scope.go:117] "RemoveContainer" containerID="338c5ba47b1f1fbbc4214f06d474391fb1d62353e0709956e8bd280ed1e10405" Apr 20 16:25:54.841915 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:54.841887 2576 generic.go:358] "Generic (PLEG): container finished" podID="cbda6188-592c-42dc-b064-bb905f0b2e00" containerID="941bd8a67e426227b1b06393c004db51e7a9bacc0f9bf9541e1b07c6d18cdd9d" exitCode=1 Apr 20 16:25:54.842014 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:54.841931 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-768df5c47b-6jpfj" event={"ID":"cbda6188-592c-42dc-b064-bb905f0b2e00","Type":"ContainerDied","Data":"941bd8a67e426227b1b06393c004db51e7a9bacc0f9bf9541e1b07c6d18cdd9d"} Apr 20 16:25:54.842453 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:54.842433 2576 scope.go:117] "RemoveContainer" containerID="941bd8a67e426227b1b06393c004db51e7a9bacc0f9bf9541e1b07c6d18cdd9d" Apr 20 16:25:55.497161 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:55.497139 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-768df5c47b-6jpfj" Apr 20 16:25:55.521991 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:55.521970 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74c4768dd5-95lnn" Apr 20 16:25:55.847729 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:55.847692 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-768df5c47b-6jpfj" 
event={"ID":"cbda6188-592c-42dc-b064-bb905f0b2e00","Type":"ContainerStarted","Data":"07f9362f3c0eabca03cf3f1b37e7c0d7bc4688bd4fe78bac6f515a39a18a548d"} Apr 20 16:25:55.848233 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:55.847870 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-768df5c47b-6jpfj" Apr 20 16:25:55.848541 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:55.848519 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-768df5c47b-6jpfj" Apr 20 16:25:55.849100 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:55.849079 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7f8db" event={"ID":"0dfe4973-64f0-41f7-a34e-6d35be53c155","Type":"ContainerStarted","Data":"36052780069e95f37c1444eb9ccb5e858b97d46f62f0fd8c27fce60dfef688e5"} Apr 20 16:25:55.850522 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:55.850501 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t2rgz" event={"ID":"c0c73998-a3cf-46ac-88ee-04698be10974","Type":"ContainerStarted","Data":"a475aff3a3fb2348513c8f9ec59f6f6004a9610f21194f98e5d3196ea25eba64"} Apr 20 16:25:55.850628 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:55.850525 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t2rgz" event={"ID":"c0c73998-a3cf-46ac-88ee-04698be10974","Type":"ContainerStarted","Data":"c00f1e7c8e753ff0d5dbc1b5b8d6b59474b214f283afff8a3a3c7c1504dcb331"} Apr 20 16:25:55.850628 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:55.850559 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-t2rgz" Apr 20 16:25:55.851887 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:55.851864 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74c4768dd5-95lnn" event={"ID":"607b691a-53a8-4d6c-9a81-238041e2f614","Type":"ContainerStarted","Data":"fb5107919890065a4c0d0ea3f9eeae1cb92971c752342205a0ca87ce3ff5ef93"} Apr 20 16:25:55.906137 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:55.906077 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-t2rgz" podStartSLOduration=129.100621281 podStartE2EDuration="2m10.906060358s" podCreationTimestamp="2026-04-20 16:23:45 +0000 UTC" firstStartedPulling="2026-04-20 16:25:53.70416834 +0000 UTC m=+161.930376260" lastFinishedPulling="2026-04-20 16:25:55.50960741 +0000 UTC m=+163.735815337" observedRunningTime="2026-04-20 16:25:55.90502884 +0000 UTC m=+164.131236780" watchObservedRunningTime="2026-04-20 16:25:55.906060358 +0000 UTC m=+164.132268299" Apr 20 16:25:55.925745 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:25:55.925696 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7f8db" podStartSLOduration=128.891772414 podStartE2EDuration="2m10.925681303s" podCreationTimestamp="2026-04-20 16:23:45 +0000 UTC" firstStartedPulling="2026-04-20 16:25:53.475701693 +0000 UTC m=+161.701909630" lastFinishedPulling="2026-04-20 16:25:55.509610585 +0000 UTC m=+163.735818519" observedRunningTime="2026-04-20 16:25:55.925061346 +0000 UTC m=+164.151269288" watchObservedRunningTime="2026-04-20 16:25:55.925681303 +0000 UTC m=+164.151889245" Apr 20 16:26:00.358394 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:00.358349 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5xd" Apr 20 16:26:01.244180 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:01.244144 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-7jzmt"] Apr 20 16:26:01.247503 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:01.247479 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7jzmt" Apr 20 16:26:01.251294 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:01.251271 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 16:26:01.251294 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:01.251272 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-lgcvb\"" Apr 20 16:26:01.251471 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:01.251271 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 16:26:01.251471 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:01.251271 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 16:26:01.251471 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:01.251278 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 16:26:01.257945 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:01.257923 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7jzmt"] Apr 20 16:26:01.308644 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:01.308614 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/7ec95865-41d4-4612-acad-c1c0a5433c03-crio-socket\") pod \"insights-runtime-extractor-7jzmt\" (UID: \"7ec95865-41d4-4612-acad-c1c0a5433c03\") " pod="openshift-insights/insights-runtime-extractor-7jzmt" Apr 20 16:26:01.308817 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:01.308653 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7ec95865-41d4-4612-acad-c1c0a5433c03-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7jzmt\" (UID: \"7ec95865-41d4-4612-acad-c1c0a5433c03\") " pod="openshift-insights/insights-runtime-extractor-7jzmt" Apr 20 16:26:01.308817 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:01.308742 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bprmz\" (UniqueName: \"kubernetes.io/projected/7ec95865-41d4-4612-acad-c1c0a5433c03-kube-api-access-bprmz\") pod \"insights-runtime-extractor-7jzmt\" (UID: \"7ec95865-41d4-4612-acad-c1c0a5433c03\") " pod="openshift-insights/insights-runtime-extractor-7jzmt" Apr 20 16:26:01.308817 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:01.308796 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7ec95865-41d4-4612-acad-c1c0a5433c03-data-volume\") pod \"insights-runtime-extractor-7jzmt\" (UID: \"7ec95865-41d4-4612-acad-c1c0a5433c03\") " pod="openshift-insights/insights-runtime-extractor-7jzmt" Apr 20 16:26:01.308922 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:01.308826 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7ec95865-41d4-4612-acad-c1c0a5433c03-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7jzmt\" (UID: \"7ec95865-41d4-4612-acad-c1c0a5433c03\") " 
pod="openshift-insights/insights-runtime-extractor-7jzmt" Apr 20 16:26:01.409340 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:01.409284 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7ec95865-41d4-4612-acad-c1c0a5433c03-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7jzmt\" (UID: \"7ec95865-41d4-4612-acad-c1c0a5433c03\") " pod="openshift-insights/insights-runtime-extractor-7jzmt" Apr 20 16:26:01.409814 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:01.409352 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bprmz\" (UniqueName: \"kubernetes.io/projected/7ec95865-41d4-4612-acad-c1c0a5433c03-kube-api-access-bprmz\") pod \"insights-runtime-extractor-7jzmt\" (UID: \"7ec95865-41d4-4612-acad-c1c0a5433c03\") " pod="openshift-insights/insights-runtime-extractor-7jzmt" Apr 20 16:26:01.409814 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:01.409379 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7ec95865-41d4-4612-acad-c1c0a5433c03-data-volume\") pod \"insights-runtime-extractor-7jzmt\" (UID: \"7ec95865-41d4-4612-acad-c1c0a5433c03\") " pod="openshift-insights/insights-runtime-extractor-7jzmt" Apr 20 16:26:01.409814 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:01.409403 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7ec95865-41d4-4612-acad-c1c0a5433c03-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7jzmt\" (UID: \"7ec95865-41d4-4612-acad-c1c0a5433c03\") " pod="openshift-insights/insights-runtime-extractor-7jzmt" Apr 20 16:26:01.409814 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:01.409435 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/7ec95865-41d4-4612-acad-c1c0a5433c03-crio-socket\") pod \"insights-runtime-extractor-7jzmt\" (UID: \"7ec95865-41d4-4612-acad-c1c0a5433c03\") " pod="openshift-insights/insights-runtime-extractor-7jzmt" Apr 20 16:26:01.409814 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:01.409499 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7ec95865-41d4-4612-acad-c1c0a5433c03-crio-socket\") pod \"insights-runtime-extractor-7jzmt\" (UID: \"7ec95865-41d4-4612-acad-c1c0a5433c03\") " pod="openshift-insights/insights-runtime-extractor-7jzmt" Apr 20 16:26:01.409814 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:01.409784 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7ec95865-41d4-4612-acad-c1c0a5433c03-data-volume\") pod \"insights-runtime-extractor-7jzmt\" (UID: \"7ec95865-41d4-4612-acad-c1c0a5433c03\") " pod="openshift-insights/insights-runtime-extractor-7jzmt" Apr 20 16:26:01.410077 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:01.409850 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7ec95865-41d4-4612-acad-c1c0a5433c03-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7jzmt\" (UID: \"7ec95865-41d4-4612-acad-c1c0a5433c03\") " pod="openshift-insights/insights-runtime-extractor-7jzmt" Apr 20 16:26:01.411796 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:01.411773 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7ec95865-41d4-4612-acad-c1c0a5433c03-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7jzmt\" (UID: \"7ec95865-41d4-4612-acad-c1c0a5433c03\") " pod="openshift-insights/insights-runtime-extractor-7jzmt" Apr 20 16:26:01.419486 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:01.419464 
2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bprmz\" (UniqueName: \"kubernetes.io/projected/7ec95865-41d4-4612-acad-c1c0a5433c03-kube-api-access-bprmz\") pod \"insights-runtime-extractor-7jzmt\" (UID: \"7ec95865-41d4-4612-acad-c1c0a5433c03\") " pod="openshift-insights/insights-runtime-extractor-7jzmt" Apr 20 16:26:01.556440 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:01.556398 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7jzmt" Apr 20 16:26:01.672712 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:01.672677 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7jzmt"] Apr 20 16:26:01.677163 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:26:01.677135 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ec95865_41d4_4612_acad_c1c0a5433c03.slice/crio-eb412736edae0cf39958fbb2ed426375295069fa9ea06ada7d18fbacaa10c1c1 WatchSource:0}: Error finding container eb412736edae0cf39958fbb2ed426375295069fa9ea06ada7d18fbacaa10c1c1: Status 404 returned error can't find the container with id eb412736edae0cf39958fbb2ed426375295069fa9ea06ada7d18fbacaa10c1c1 Apr 20 16:26:01.870896 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:01.870811 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7jzmt" event={"ID":"7ec95865-41d4-4612-acad-c1c0a5433c03","Type":"ContainerStarted","Data":"e2b0056242318ff312b6786cf19d39a9e26acd892f3c83082357f11d3d380679"} Apr 20 16:26:01.870896 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:01.870847 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7jzmt" 
event={"ID":"7ec95865-41d4-4612-acad-c1c0a5433c03","Type":"ContainerStarted","Data":"eb412736edae0cf39958fbb2ed426375295069fa9ea06ada7d18fbacaa10c1c1"} Apr 20 16:26:02.875282 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:02.875238 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7jzmt" event={"ID":"7ec95865-41d4-4612-acad-c1c0a5433c03","Type":"ContainerStarted","Data":"b2b92eed5dbd26b344eb7af09d2a327a07f51e42b2c7c4e29ee6d3ca95d41bb9"} Apr 20 16:26:03.879415 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:03.879325 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7jzmt" event={"ID":"7ec95865-41d4-4612-acad-c1c0a5433c03","Type":"ContainerStarted","Data":"5157cf1e59a7490c57e0893898540d612e5ddea431e98452884e89352ecba6e4"} Apr 20 16:26:03.896712 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:03.896666 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-7jzmt" podStartSLOduration=1.040840114 podStartE2EDuration="2.896652276s" podCreationTimestamp="2026-04-20 16:26:01 +0000 UTC" firstStartedPulling="2026-04-20 16:26:01.736047028 +0000 UTC m=+169.962254952" lastFinishedPulling="2026-04-20 16:26:03.591859195 +0000 UTC m=+171.818067114" observedRunningTime="2026-04-20 16:26:03.896209661 +0000 UTC m=+172.122417602" watchObservedRunningTime="2026-04-20 16:26:03.896652276 +0000 UTC m=+172.122860258" Apr 20 16:26:05.858015 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:05.857984 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-t2rgz" Apr 20 16:26:13.336268 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:13.336227 2576 patch_prober.go:28] interesting pod/image-registry-6556f6497c-5xzdw container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" 
start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 20 16:26:13.336654 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:13.336291 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" podUID="b677bf42-4f9a-44b2-8c42-7b22242cad9b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 16:26:14.845916 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:14.845889 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" Apr 20 16:26:21.737868 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.737832 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-99k55"] Apr 20 16:26:21.742956 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.742934 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:21.745485 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.745443 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 16:26:21.745651 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.745537 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 16:26:21.745713 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.745645 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 16:26:21.746908 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.746862 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-pd26p\"" Apr 20 16:26:21.746908 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.746881 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 16:26:21.747105 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.746960 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 16:26:21.747105 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.746979 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 16:26:21.881824 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.881788 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f2193334-65a2-4f35-bbed-0117cbe5d424-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-99k55\" (UID: 
\"f2193334-65a2-4f35-bbed-0117cbe5d424\") " pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:21.881824 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.881835 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f2193334-65a2-4f35-bbed-0117cbe5d424-sys\") pod \"node-exporter-99k55\" (UID: \"f2193334-65a2-4f35-bbed-0117cbe5d424\") " pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:21.882080 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.881898 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f2193334-65a2-4f35-bbed-0117cbe5d424-node-exporter-textfile\") pod \"node-exporter-99k55\" (UID: \"f2193334-65a2-4f35-bbed-0117cbe5d424\") " pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:21.882080 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.881941 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f2193334-65a2-4f35-bbed-0117cbe5d424-node-exporter-wtmp\") pod \"node-exporter-99k55\" (UID: \"f2193334-65a2-4f35-bbed-0117cbe5d424\") " pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:21.882080 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.881963 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f2193334-65a2-4f35-bbed-0117cbe5d424-metrics-client-ca\") pod \"node-exporter-99k55\" (UID: \"f2193334-65a2-4f35-bbed-0117cbe5d424\") " pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:21.882080 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.881986 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvkct\" (UniqueName: 
\"kubernetes.io/projected/f2193334-65a2-4f35-bbed-0117cbe5d424-kube-api-access-rvkct\") pod \"node-exporter-99k55\" (UID: \"f2193334-65a2-4f35-bbed-0117cbe5d424\") " pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:21.882080 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.882033 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f2193334-65a2-4f35-bbed-0117cbe5d424-node-exporter-accelerators-collector-config\") pod \"node-exporter-99k55\" (UID: \"f2193334-65a2-4f35-bbed-0117cbe5d424\") " pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:21.882080 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.882058 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f2193334-65a2-4f35-bbed-0117cbe5d424-root\") pod \"node-exporter-99k55\" (UID: \"f2193334-65a2-4f35-bbed-0117cbe5d424\") " pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:21.882080 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.882076 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f2193334-65a2-4f35-bbed-0117cbe5d424-node-exporter-tls\") pod \"node-exporter-99k55\" (UID: \"f2193334-65a2-4f35-bbed-0117cbe5d424\") " pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:21.983206 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.983176 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f2193334-65a2-4f35-bbed-0117cbe5d424-root\") pod \"node-exporter-99k55\" (UID: \"f2193334-65a2-4f35-bbed-0117cbe5d424\") " pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:21.983206 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.983214 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f2193334-65a2-4f35-bbed-0117cbe5d424-node-exporter-tls\") pod \"node-exporter-99k55\" (UID: \"f2193334-65a2-4f35-bbed-0117cbe5d424\") " pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:21.983402 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.983295 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f2193334-65a2-4f35-bbed-0117cbe5d424-root\") pod \"node-exporter-99k55\" (UID: \"f2193334-65a2-4f35-bbed-0117cbe5d424\") " pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:21.983402 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:26:21.983337 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 16:26:21.983402 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.983338 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f2193334-65a2-4f35-bbed-0117cbe5d424-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-99k55\" (UID: \"f2193334-65a2-4f35-bbed-0117cbe5d424\") " pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:21.983402 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:26:21.983382 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2193334-65a2-4f35-bbed-0117cbe5d424-node-exporter-tls podName:f2193334-65a2-4f35-bbed-0117cbe5d424 nodeName:}" failed. No retries permitted until 2026-04-20 16:26:22.483368383 +0000 UTC m=+190.709576303 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/f2193334-65a2-4f35-bbed-0117cbe5d424-node-exporter-tls") pod "node-exporter-99k55" (UID: "f2193334-65a2-4f35-bbed-0117cbe5d424") : secret "node-exporter-tls" not found Apr 20 16:26:21.983550 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.983406 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f2193334-65a2-4f35-bbed-0117cbe5d424-sys\") pod \"node-exporter-99k55\" (UID: \"f2193334-65a2-4f35-bbed-0117cbe5d424\") " pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:21.983550 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.983427 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f2193334-65a2-4f35-bbed-0117cbe5d424-node-exporter-textfile\") pod \"node-exporter-99k55\" (UID: \"f2193334-65a2-4f35-bbed-0117cbe5d424\") " pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:21.983550 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.983445 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f2193334-65a2-4f35-bbed-0117cbe5d424-node-exporter-wtmp\") pod \"node-exporter-99k55\" (UID: \"f2193334-65a2-4f35-bbed-0117cbe5d424\") " pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:21.983550 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.983466 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f2193334-65a2-4f35-bbed-0117cbe5d424-metrics-client-ca\") pod \"node-exporter-99k55\" (UID: \"f2193334-65a2-4f35-bbed-0117cbe5d424\") " pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:21.983550 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.983498 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f2193334-65a2-4f35-bbed-0117cbe5d424-sys\") pod \"node-exporter-99k55\" (UID: \"f2193334-65a2-4f35-bbed-0117cbe5d424\") " pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:21.983813 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.983592 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f2193334-65a2-4f35-bbed-0117cbe5d424-node-exporter-wtmp\") pod \"node-exporter-99k55\" (UID: \"f2193334-65a2-4f35-bbed-0117cbe5d424\") " pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:21.983813 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.983490 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvkct\" (UniqueName: \"kubernetes.io/projected/f2193334-65a2-4f35-bbed-0117cbe5d424-kube-api-access-rvkct\") pod \"node-exporter-99k55\" (UID: \"f2193334-65a2-4f35-bbed-0117cbe5d424\") " pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:21.983813 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.983667 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f2193334-65a2-4f35-bbed-0117cbe5d424-node-exporter-accelerators-collector-config\") pod \"node-exporter-99k55\" (UID: \"f2193334-65a2-4f35-bbed-0117cbe5d424\") " pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:21.983935 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.983836 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f2193334-65a2-4f35-bbed-0117cbe5d424-node-exporter-textfile\") pod \"node-exporter-99k55\" (UID: \"f2193334-65a2-4f35-bbed-0117cbe5d424\") " pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:21.984074 ip-10-0-142-44 kubenswrapper[2576]: 
I0420 16:26:21.984057 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f2193334-65a2-4f35-bbed-0117cbe5d424-metrics-client-ca\") pod \"node-exporter-99k55\" (UID: \"f2193334-65a2-4f35-bbed-0117cbe5d424\") " pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:21.984156 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.984137 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f2193334-65a2-4f35-bbed-0117cbe5d424-node-exporter-accelerators-collector-config\") pod \"node-exporter-99k55\" (UID: \"f2193334-65a2-4f35-bbed-0117cbe5d424\") " pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:21.986166 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.986142 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f2193334-65a2-4f35-bbed-0117cbe5d424-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-99k55\" (UID: \"f2193334-65a2-4f35-bbed-0117cbe5d424\") " pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:21.995063 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:21.994996 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvkct\" (UniqueName: \"kubernetes.io/projected/f2193334-65a2-4f35-bbed-0117cbe5d424-kube-api-access-rvkct\") pod \"node-exporter-99k55\" (UID: \"f2193334-65a2-4f35-bbed-0117cbe5d424\") " pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:22.487904 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:22.487860 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f2193334-65a2-4f35-bbed-0117cbe5d424-node-exporter-tls\") pod \"node-exporter-99k55\" (UID: \"f2193334-65a2-4f35-bbed-0117cbe5d424\") " 
pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:22.490199 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:22.490163 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f2193334-65a2-4f35-bbed-0117cbe5d424-node-exporter-tls\") pod \"node-exporter-99k55\" (UID: \"f2193334-65a2-4f35-bbed-0117cbe5d424\") " pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:22.652470 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:22.652430 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-99k55" Apr 20 16:26:22.660431 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:26:22.660402 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2193334_65a2_4f35_bbed_0117cbe5d424.slice/crio-fe0cf31d3b7b2638902878f908782df4743d08d70bcab2137e8b272a7a6de80a WatchSource:0}: Error finding container fe0cf31d3b7b2638902878f908782df4743d08d70bcab2137e8b272a7a6de80a: Status 404 returned error can't find the container with id fe0cf31d3b7b2638902878f908782df4743d08d70bcab2137e8b272a7a6de80a Apr 20 16:26:22.927152 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:22.927119 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-99k55" event={"ID":"f2193334-65a2-4f35-bbed-0117cbe5d424","Type":"ContainerStarted","Data":"fe0cf31d3b7b2638902878f908782df4743d08d70bcab2137e8b272a7a6de80a"} Apr 20 16:26:23.516314 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:23.516287 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6556f6497c-5xzdw"] Apr 20 16:26:23.930897 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:23.930863 2576 generic.go:358] "Generic (PLEG): container finished" podID="f2193334-65a2-4f35-bbed-0117cbe5d424" 
containerID="25e208a5440f72c5aa2c7c3ffba44a44c04cc21e22464394c97281069eeaa28f" exitCode=0
Apr 20 16:26:23.931269 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:23.930903 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-99k55" event={"ID":"f2193334-65a2-4f35-bbed-0117cbe5d424","Type":"ContainerDied","Data":"25e208a5440f72c5aa2c7c3ffba44a44c04cc21e22464394c97281069eeaa28f"}
Apr 20 16:26:24.935729 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:24.935689 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-99k55" event={"ID":"f2193334-65a2-4f35-bbed-0117cbe5d424","Type":"ContainerStarted","Data":"68c623990904ed3f1d5ae02e8c3923f2e869775e794e10aab238ad51e700aedf"}
Apr 20 16:26:24.936149 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:24.935738 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-99k55" event={"ID":"f2193334-65a2-4f35-bbed-0117cbe5d424","Type":"ContainerStarted","Data":"3f4dcac8cc17040210b2e11a0e5c29431e315f18b6b688472b5d4b2222c6bdaf"}
Apr 20 16:26:24.955214 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:24.955160 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-99k55" podStartSLOduration=3.130861573 podStartE2EDuration="3.955147521s" podCreationTimestamp="2026-04-20 16:26:21 +0000 UTC" firstStartedPulling="2026-04-20 16:26:22.662364635 +0000 UTC m=+190.888572555" lastFinishedPulling="2026-04-20 16:26:23.48665058 +0000 UTC m=+191.712858503" observedRunningTime="2026-04-20 16:26:24.953673916 +0000 UTC m=+193.179881883" watchObservedRunningTime="2026-04-20 16:26:24.955147521 +0000 UTC m=+193.181355498"
Apr 20 16:26:35.529915 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:35.529874 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" podUID="c666ba21-097e-4fb4-ac00-f607e9a9198f" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 20 16:26:45.529966 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:45.529921 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" podUID="c666ba21-097e-4fb4-ac00-f607e9a9198f" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 20 16:26:48.536708 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.536645 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" podUID="b677bf42-4f9a-44b2-8c42-7b22242cad9b" containerName="registry" containerID="cri-o://14213399b4fbf42294d54bfaaef29abc4e09be4b98ca4bcf6eff54fd9d38f585" gracePeriod=30
Apr 20 16:26:48.775278 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.775255 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6556f6497c-5xzdw"
Apr 20 16:26:48.883207 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.883108 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b677bf42-4f9a-44b2-8c42-7b22242cad9b-installation-pull-secrets\") pod \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") "
Apr 20 16:26:48.883207 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.883152 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-bound-sa-token\") pod \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") "
Apr 20 16:26:48.883207 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.883176 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2x2z\" (UniqueName: \"kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-kube-api-access-s2x2z\") pod \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") "
Apr 20 16:26:48.883207 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.883205 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b677bf42-4f9a-44b2-8c42-7b22242cad9b-trusted-ca\") pod \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") "
Apr 20 16:26:48.883528 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.883294 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls\") pod \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") "
Apr 20 16:26:48.883528 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.883343 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-certificates\") pod \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") "
Apr 20 16:26:48.883528 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.883374 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b677bf42-4f9a-44b2-8c42-7b22242cad9b-ca-trust-extracted\") pod \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") "
Apr 20 16:26:48.883528 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.883407 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b677bf42-4f9a-44b2-8c42-7b22242cad9b-image-registry-private-configuration\") pod \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\" (UID: \"b677bf42-4f9a-44b2-8c42-7b22242cad9b\") "
Apr 20 16:26:48.883731 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.883653 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b677bf42-4f9a-44b2-8c42-7b22242cad9b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b677bf42-4f9a-44b2-8c42-7b22242cad9b" (UID: "b677bf42-4f9a-44b2-8c42-7b22242cad9b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 16:26:48.883858 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.883807 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b677bf42-4f9a-44b2-8c42-7b22242cad9b" (UID: "b677bf42-4f9a-44b2-8c42-7b22242cad9b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 16:26:48.885810 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.885773 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b677bf42-4f9a-44b2-8c42-7b22242cad9b" (UID: "b677bf42-4f9a-44b2-8c42-7b22242cad9b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 16:26:48.885924 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.885800 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b677bf42-4f9a-44b2-8c42-7b22242cad9b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b677bf42-4f9a-44b2-8c42-7b22242cad9b" (UID: "b677bf42-4f9a-44b2-8c42-7b22242cad9b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 16:26:48.885924 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.885773 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b677bf42-4f9a-44b2-8c42-7b22242cad9b" (UID: "b677bf42-4f9a-44b2-8c42-7b22242cad9b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 16:26:48.885924 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.885881 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-kube-api-access-s2x2z" (OuterVolumeSpecName: "kube-api-access-s2x2z") pod "b677bf42-4f9a-44b2-8c42-7b22242cad9b" (UID: "b677bf42-4f9a-44b2-8c42-7b22242cad9b"). InnerVolumeSpecName "kube-api-access-s2x2z". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 16:26:48.886092 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.886049 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b677bf42-4f9a-44b2-8c42-7b22242cad9b-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "b677bf42-4f9a-44b2-8c42-7b22242cad9b" (UID: "b677bf42-4f9a-44b2-8c42-7b22242cad9b"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 16:26:48.892640 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.892609 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b677bf42-4f9a-44b2-8c42-7b22242cad9b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b677bf42-4f9a-44b2-8c42-7b22242cad9b" (UID: "b677bf42-4f9a-44b2-8c42-7b22242cad9b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 16:26:48.984678 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.984640 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b677bf42-4f9a-44b2-8c42-7b22242cad9b-trusted-ca\") on node \"ip-10-0-142-44.ec2.internal\" DevicePath \"\""
Apr 20 16:26:48.984678 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.984672 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-tls\") on node \"ip-10-0-142-44.ec2.internal\" DevicePath \"\""
Apr 20 16:26:48.984678 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.984683 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b677bf42-4f9a-44b2-8c42-7b22242cad9b-registry-certificates\") on node \"ip-10-0-142-44.ec2.internal\" DevicePath \"\""
Apr 20 16:26:48.984931 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.984693 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b677bf42-4f9a-44b2-8c42-7b22242cad9b-ca-trust-extracted\") on node \"ip-10-0-142-44.ec2.internal\" DevicePath \"\""
Apr 20 16:26:48.984931 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.984704 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b677bf42-4f9a-44b2-8c42-7b22242cad9b-image-registry-private-configuration\") on node \"ip-10-0-142-44.ec2.internal\" DevicePath \"\""
Apr 20 16:26:48.984931 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.984715 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b677bf42-4f9a-44b2-8c42-7b22242cad9b-installation-pull-secrets\") on node \"ip-10-0-142-44.ec2.internal\" DevicePath \"\""
Apr 20 16:26:48.984931 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.984724 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-bound-sa-token\") on node \"ip-10-0-142-44.ec2.internal\" DevicePath \"\""
Apr 20 16:26:48.984931 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.984733 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s2x2z\" (UniqueName: \"kubernetes.io/projected/b677bf42-4f9a-44b2-8c42-7b22242cad9b-kube-api-access-s2x2z\") on node \"ip-10-0-142-44.ec2.internal\" DevicePath \"\""
Apr 20 16:26:48.998691 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.998652 2576 generic.go:358] "Generic (PLEG): container finished" podID="b677bf42-4f9a-44b2-8c42-7b22242cad9b" containerID="14213399b4fbf42294d54bfaaef29abc4e09be4b98ca4bcf6eff54fd9d38f585" exitCode=0
Apr 20 16:26:48.998863 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.998703 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" event={"ID":"b677bf42-4f9a-44b2-8c42-7b22242cad9b","Type":"ContainerDied","Data":"14213399b4fbf42294d54bfaaef29abc4e09be4b98ca4bcf6eff54fd9d38f585"}
Apr 20 16:26:48.998863 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.998722 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6556f6497c-5xzdw"
Apr 20 16:26:48.998863 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.998732 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6556f6497c-5xzdw" event={"ID":"b677bf42-4f9a-44b2-8c42-7b22242cad9b","Type":"ContainerDied","Data":"3b139a8fc104ee62e890420690a2ad84d22463435ec651d4cce2a55ea0b4dfd0"}
Apr 20 16:26:48.998863 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:48.998751 2576 scope.go:117] "RemoveContainer" containerID="14213399b4fbf42294d54bfaaef29abc4e09be4b98ca4bcf6eff54fd9d38f585"
Apr 20 16:26:49.006803 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:49.006785 2576 scope.go:117] "RemoveContainer" containerID="14213399b4fbf42294d54bfaaef29abc4e09be4b98ca4bcf6eff54fd9d38f585"
Apr 20 16:26:49.007082 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:26:49.007058 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14213399b4fbf42294d54bfaaef29abc4e09be4b98ca4bcf6eff54fd9d38f585\": container with ID starting with 14213399b4fbf42294d54bfaaef29abc4e09be4b98ca4bcf6eff54fd9d38f585 not found: ID does not exist" containerID="14213399b4fbf42294d54bfaaef29abc4e09be4b98ca4bcf6eff54fd9d38f585"
Apr 20 16:26:49.007135 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:49.007094 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14213399b4fbf42294d54bfaaef29abc4e09be4b98ca4bcf6eff54fd9d38f585"} err="failed to get container status \"14213399b4fbf42294d54bfaaef29abc4e09be4b98ca4bcf6eff54fd9d38f585\": rpc error: code = NotFound desc = could not find container \"14213399b4fbf42294d54bfaaef29abc4e09be4b98ca4bcf6eff54fd9d38f585\": container with ID starting with 14213399b4fbf42294d54bfaaef29abc4e09be4b98ca4bcf6eff54fd9d38f585 not found: ID does not exist"
Apr 20 16:26:49.018160 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:49.018128 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6556f6497c-5xzdw"]
Apr 20 16:26:49.021518 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:49.021496 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6556f6497c-5xzdw"]
Apr 20 16:26:50.362236 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:50.362202 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b677bf42-4f9a-44b2-8c42-7b22242cad9b" path="/var/lib/kubelet/pods/b677bf42-4f9a-44b2-8c42-7b22242cad9b/volumes"
Apr 20 16:26:55.529983 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:55.529942 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" podUID="c666ba21-097e-4fb4-ac00-f607e9a9198f" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 20 16:26:55.530453 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:55.530010 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n"
Apr 20 16:26:55.530494 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:55.530470 2576 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"a8621c1faee015c365989cee93e2e5c9f54cc868b86cd127c66f6cda818c9838"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 20 16:26:55.530528 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:55.530509 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" podUID="c666ba21-097e-4fb4-ac00-f607e9a9198f" containerName="service-proxy" containerID="cri-o://a8621c1faee015c365989cee93e2e5c9f54cc868b86cd127c66f6cda818c9838" gracePeriod=30
Apr 20 16:26:56.018403 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:56.018372 2576 generic.go:358] "Generic (PLEG): container finished" podID="c666ba21-097e-4fb4-ac00-f607e9a9198f" containerID="a8621c1faee015c365989cee93e2e5c9f54cc868b86cd127c66f6cda818c9838" exitCode=2
Apr 20 16:26:56.018565 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:56.018442 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" event={"ID":"c666ba21-097e-4fb4-ac00-f607e9a9198f","Type":"ContainerDied","Data":"a8621c1faee015c365989cee93e2e5c9f54cc868b86cd127c66f6cda818c9838"}
Apr 20 16:26:56.018565 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:26:56.018478 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6976c7d4-7xx9n" event={"ID":"c666ba21-097e-4fb4-ac00-f607e9a9198f","Type":"ContainerStarted","Data":"d9b5bfb202f2a0c0a5e34c97239d879d2a4fe9e3477dd759e4ccd27fefba4263"}
Apr 20 16:27:24.142912 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:27:24.142869 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs\") pod \"network-metrics-daemon-tr5xd\" (UID: \"7948c105-68aa-437a-a0ac-fa0d535c7b37\") " pod="openshift-multus/network-metrics-daemon-tr5xd"
Apr 20 16:27:24.145217 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:27:24.145191 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7948c105-68aa-437a-a0ac-fa0d535c7b37-metrics-certs\") pod \"network-metrics-daemon-tr5xd\" (UID: \"7948c105-68aa-437a-a0ac-fa0d535c7b37\") " pod="openshift-multus/network-metrics-daemon-tr5xd"
Apr 20 16:27:24.361206 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:27:24.361180 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xjxmn\""
Apr 20 16:27:24.369430 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:27:24.369403 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5xd"
Apr 20 16:27:24.486000 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:27:24.485970 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tr5xd"]
Apr 20 16:27:24.489315 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:27:24.489286 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7948c105_68aa_437a_a0ac_fa0d535c7b37.slice/crio-e13a50eba0ae3205813611e252cb733b4b16b3a86a67d48280d2544965ae6e61 WatchSource:0}: Error finding container e13a50eba0ae3205813611e252cb733b4b16b3a86a67d48280d2544965ae6e61: Status 404 returned error can't find the container with id e13a50eba0ae3205813611e252cb733b4b16b3a86a67d48280d2544965ae6e61
Apr 20 16:27:25.095751 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:27:25.095710 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tr5xd" event={"ID":"7948c105-68aa-437a-a0ac-fa0d535c7b37","Type":"ContainerStarted","Data":"e13a50eba0ae3205813611e252cb733b4b16b3a86a67d48280d2544965ae6e61"}
Apr 20 16:27:26.099661 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:27:26.099625 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tr5xd" event={"ID":"7948c105-68aa-437a-a0ac-fa0d535c7b37","Type":"ContainerStarted","Data":"5ebc750bb0b6657cb5fda4d76fec3551e75d5e431033f78fbaba482cc207aa6f"}
Apr 20 16:27:26.099661 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:27:26.099661 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tr5xd" event={"ID":"7948c105-68aa-437a-a0ac-fa0d535c7b37","Type":"ContainerStarted","Data":"7f3fc0c516abf9e954ef2959311053bbd6b1204c4e9088118715f0ec7bc9c1fe"}
Apr 20 16:27:26.115625 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:27:26.115574 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-tr5xd" podStartSLOduration=252.917151894 podStartE2EDuration="4m14.115556655s" podCreationTimestamp="2026-04-20 16:23:12 +0000 UTC" firstStartedPulling="2026-04-20 16:27:24.491159709 +0000 UTC m=+252.717367633" lastFinishedPulling="2026-04-20 16:27:25.689564409 +0000 UTC m=+253.915772394" observedRunningTime="2026-04-20 16:27:26.114010379 +0000 UTC m=+254.340218321" watchObservedRunningTime="2026-04-20 16:27:26.115556655 +0000 UTC m=+254.341764596"
Apr 20 16:28:12.234095 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:28:12.234064 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s95ld_3c78e1c2-fb6e-458b-8593-64d3e48a714e/ovn-acl-logging/0.log"
Apr 20 16:28:12.234636 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:28:12.234104 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s95ld_3c78e1c2-fb6e-458b-8593-64d3e48a714e/ovn-acl-logging/0.log"
Apr 20 16:28:12.239172 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:28:12.239145 2576 kubelet.go:1628] "Image garbage collection succeeded"
Apr 20 16:29:14.306258 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:14.306163 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-k4zbp"]
Apr 20 16:29:14.306655 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:14.306446 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b677bf42-4f9a-44b2-8c42-7b22242cad9b" containerName="registry"
Apr 20 16:29:14.306655 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:14.306462 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b677bf42-4f9a-44b2-8c42-7b22242cad9b" containerName="registry"
Apr 20 16:29:14.306655 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:14.306513 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b677bf42-4f9a-44b2-8c42-7b22242cad9b" containerName="registry"
Apr 20 16:29:14.309156 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:14.309139 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k4zbp"
Apr 20 16:29:14.311607 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:14.311585 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 20 16:29:14.311607 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:14.311604 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 20 16:29:14.312617 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:14.312604 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-h876t\""
Apr 20 16:29:14.319024 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:14.318999 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-k4zbp"]
Apr 20 16:29:14.405818 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:14.405780 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jmmf\" (UniqueName: \"kubernetes.io/projected/91aea935-20f3-4802-8e93-6603088be733-kube-api-access-6jmmf\") pod \"openshift-lws-operator-bfc7f696d-k4zbp\" (UID: \"91aea935-20f3-4802-8e93-6603088be733\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k4zbp"
Apr 20 16:29:14.405818 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:14.405819 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/91aea935-20f3-4802-8e93-6603088be733-tmp\") pod \"openshift-lws-operator-bfc7f696d-k4zbp\" (UID: \"91aea935-20f3-4802-8e93-6603088be733\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k4zbp"
Apr 20 16:29:14.506377 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:14.506339 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jmmf\" (UniqueName: \"kubernetes.io/projected/91aea935-20f3-4802-8e93-6603088be733-kube-api-access-6jmmf\") pod \"openshift-lws-operator-bfc7f696d-k4zbp\" (UID: \"91aea935-20f3-4802-8e93-6603088be733\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k4zbp"
Apr 20 16:29:14.506377 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:14.506374 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/91aea935-20f3-4802-8e93-6603088be733-tmp\") pod \"openshift-lws-operator-bfc7f696d-k4zbp\" (UID: \"91aea935-20f3-4802-8e93-6603088be733\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k4zbp"
Apr 20 16:29:14.506721 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:14.506703 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/91aea935-20f3-4802-8e93-6603088be733-tmp\") pod \"openshift-lws-operator-bfc7f696d-k4zbp\" (UID: \"91aea935-20f3-4802-8e93-6603088be733\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k4zbp"
Apr 20 16:29:14.514468 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:14.514435 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jmmf\" (UniqueName: \"kubernetes.io/projected/91aea935-20f3-4802-8e93-6603088be733-kube-api-access-6jmmf\") pod \"openshift-lws-operator-bfc7f696d-k4zbp\" (UID: \"91aea935-20f3-4802-8e93-6603088be733\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k4zbp"
Apr 20 16:29:14.617734 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:14.617649 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k4zbp"
Apr 20 16:29:14.733373 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:14.733339 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-k4zbp"]
Apr 20 16:29:14.738141 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:29:14.738101 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91aea935_20f3_4802_8e93_6603088be733.slice/crio-d7fabc0c44607dbc03a64c61be4931b9d9c847d4d45393d5b2f4fd4da41f322e WatchSource:0}: Error finding container d7fabc0c44607dbc03a64c61be4931b9d9c847d4d45393d5b2f4fd4da41f322e: Status 404 returned error can't find the container with id d7fabc0c44607dbc03a64c61be4931b9d9c847d4d45393d5b2f4fd4da41f322e
Apr 20 16:29:14.739376 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:14.739356 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 16:29:15.379924 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:15.379882 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k4zbp" event={"ID":"91aea935-20f3-4802-8e93-6603088be733","Type":"ContainerStarted","Data":"d7fabc0c44607dbc03a64c61be4931b9d9c847d4d45393d5b2f4fd4da41f322e"}
Apr 20 16:29:17.386133 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:17.386046 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k4zbp" event={"ID":"91aea935-20f3-4802-8e93-6603088be733","Type":"ContainerStarted","Data":"161df952a61c2f3d1f035ddccc3d3f609770db7b4bd70994aed6e9e9f65b4458"}
Apr 20 16:29:17.402239 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:17.402184 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k4zbp" podStartSLOduration=1.007878973 podStartE2EDuration="3.402169271s" podCreationTimestamp="2026-04-20 16:29:14 +0000 UTC" firstStartedPulling="2026-04-20 16:29:14.739517553 +0000 UTC m=+362.965725473" lastFinishedPulling="2026-04-20 16:29:17.133807852 +0000 UTC m=+365.360015771" observedRunningTime="2026-04-20 16:29:17.400563866 +0000 UTC m=+365.626771807" watchObservedRunningTime="2026-04-20 16:29:17.402169271 +0000 UTC m=+365.628377213"
Apr 20 16:29:34.939119 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:34.939085 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-59c64b9875-wr4hm"]
Apr 20 16:29:34.942215 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:34.942185 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-wr4hm"
Apr 20 16:29:34.944870 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:34.944849 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 20 16:29:34.944993 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:34.944880 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 20 16:29:34.945150 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:34.945134 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-kwhtg\""
Apr 20 16:29:34.945150 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:34.945145 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 20 16:29:34.945273 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:34.945235 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 20 16:29:34.956817 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:34.956794 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-59c64b9875-wr4hm"]
Apr 20 16:29:35.056416 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:35.056375 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/372ab175-146b-459d-9716-a7cf657d987d-apiservice-cert\") pod \"opendatahub-operator-controller-manager-59c64b9875-wr4hm\" (UID: \"372ab175-146b-459d-9716-a7cf657d987d\") " pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-wr4hm"
Apr 20 16:29:35.056416 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:35.056414 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9qp5\" (UniqueName: \"kubernetes.io/projected/372ab175-146b-459d-9716-a7cf657d987d-kube-api-access-w9qp5\") pod \"opendatahub-operator-controller-manager-59c64b9875-wr4hm\" (UID: \"372ab175-146b-459d-9716-a7cf657d987d\") " pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-wr4hm"
Apr 20 16:29:35.056636 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:35.056448 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/372ab175-146b-459d-9716-a7cf657d987d-webhook-cert\") pod \"opendatahub-operator-controller-manager-59c64b9875-wr4hm\" (UID: \"372ab175-146b-459d-9716-a7cf657d987d\") " pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-wr4hm"
Apr 20 16:29:35.156923 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:35.156887 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/372ab175-146b-459d-9716-a7cf657d987d-apiservice-cert\") pod \"opendatahub-operator-controller-manager-59c64b9875-wr4hm\" (UID: \"372ab175-146b-459d-9716-a7cf657d987d\") " pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-wr4hm"
Apr 20 16:29:35.156923 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:35.156926 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9qp5\" (UniqueName: \"kubernetes.io/projected/372ab175-146b-459d-9716-a7cf657d987d-kube-api-access-w9qp5\") pod \"opendatahub-operator-controller-manager-59c64b9875-wr4hm\" (UID: \"372ab175-146b-459d-9716-a7cf657d987d\") " pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-wr4hm"
Apr 20 16:29:35.157120 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:35.156969 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/372ab175-146b-459d-9716-a7cf657d987d-webhook-cert\") pod \"opendatahub-operator-controller-manager-59c64b9875-wr4hm\" (UID: \"372ab175-146b-459d-9716-a7cf657d987d\") " pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-wr4hm"
Apr 20 16:29:35.159322 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:35.159291 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/372ab175-146b-459d-9716-a7cf657d987d-webhook-cert\") pod \"opendatahub-operator-controller-manager-59c64b9875-wr4hm\" (UID: \"372ab175-146b-459d-9716-a7cf657d987d\") " pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-wr4hm"
Apr 20 16:29:35.159422 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:35.159323 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/372ab175-146b-459d-9716-a7cf657d987d-apiservice-cert\") pod \"opendatahub-operator-controller-manager-59c64b9875-wr4hm\" (UID: \"372ab175-146b-459d-9716-a7cf657d987d\") " pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-wr4hm"
Apr 20 16:29:35.172390 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:35.172358 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9qp5\" (UniqueName: \"kubernetes.io/projected/372ab175-146b-459d-9716-a7cf657d987d-kube-api-access-w9qp5\") pod \"opendatahub-operator-controller-manager-59c64b9875-wr4hm\" (UID: \"372ab175-146b-459d-9716-a7cf657d987d\") " pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-wr4hm"
Apr 20 16:29:35.252611 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:35.252550 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-wr4hm"
Apr 20 16:29:35.374560 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:35.374529 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-59c64b9875-wr4hm"]
Apr 20 16:29:35.377934 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:29:35.377904 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod372ab175_146b_459d_9716_a7cf657d987d.slice/crio-a4810dd0edac5c582e879bf66406e47b4171384a142004e48a17d8523891b6bf WatchSource:0}: Error finding container a4810dd0edac5c582e879bf66406e47b4171384a142004e48a17d8523891b6bf: Status 404 returned error can't find the container with id a4810dd0edac5c582e879bf66406e47b4171384a142004e48a17d8523891b6bf
Apr 20 16:29:35.431136 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:35.431104 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-wr4hm" event={"ID":"372ab175-146b-459d-9716-a7cf657d987d","Type":"ContainerStarted","Data":"a4810dd0edac5c582e879bf66406e47b4171384a142004e48a17d8523891b6bf"}
Apr 20 16:29:38.440283 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:38.440249 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-wr4hm" event={"ID":"372ab175-146b-459d-9716-a7cf657d987d","Type":"ContainerStarted","Data":"97a1793740ac15df30280ccdb71af6936df1a2a48def194a20c6c854b71092f9"}
Apr 20 16:29:38.440671 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:38.440397 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-wr4hm"
Apr 20 16:29:38.466820 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:38.466769 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-wr4hm" podStartSLOduration=1.951548479 podStartE2EDuration="4.466735248s" podCreationTimestamp="2026-04-20 16:29:34 +0000 UTC" firstStartedPulling="2026-04-20 16:29:35.37955835 +0000 UTC m=+383.605766273" lastFinishedPulling="2026-04-20 16:29:37.894745122 +0000 UTC m=+386.120953042" observedRunningTime="2026-04-20 16:29:38.46519485 +0000 UTC m=+386.691402813" watchObservedRunningTime="2026-04-20 16:29:38.466735248 +0000 UTC m=+386.692943259"
Apr 20 16:29:49.446354 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:49.446323 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-59c64b9875-wr4hm"
Apr 20 16:29:52.412261 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:52.412227 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-7b9c9c888c-98w6z"]
Apr 20 16:29:52.415413 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:52.415394 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7b9c9c888c-98w6z"
Apr 20 16:29:52.418162 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:52.418122 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 20 16:29:52.418162 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:52.418142 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 20 16:29:52.418334 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:52.418146 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 20 16:29:52.419423 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:52.419403 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 20 16:29:52.419525 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:52.419507 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-8vzjl\""
Apr 20 16:29:52.423121 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:52.423101 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7b9c9c888c-98w6z"]
Apr 20 16:29:52.475244 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:52.475201 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/121fb884-b4fb-4849-96fd-e84b3d5a0884-tls-certs\") pod \"kube-auth-proxy-7b9c9c888c-98w6z\" (UID: \"121fb884-b4fb-4849-96fd-e84b3d5a0884\") " pod="openshift-ingress/kube-auth-proxy-7b9c9c888c-98w6z"
Apr 20 16:29:52.475422 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:52.475251 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csx4j\" (UniqueName:
\"kubernetes.io/projected/121fb884-b4fb-4849-96fd-e84b3d5a0884-kube-api-access-csx4j\") pod \"kube-auth-proxy-7b9c9c888c-98w6z\" (UID: \"121fb884-b4fb-4849-96fd-e84b3d5a0884\") " pod="openshift-ingress/kube-auth-proxy-7b9c9c888c-98w6z" Apr 20 16:29:52.475505 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:52.475475 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/121fb884-b4fb-4849-96fd-e84b3d5a0884-tmp\") pod \"kube-auth-proxy-7b9c9c888c-98w6z\" (UID: \"121fb884-b4fb-4849-96fd-e84b3d5a0884\") " pod="openshift-ingress/kube-auth-proxy-7b9c9c888c-98w6z" Apr 20 16:29:52.576376 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:52.576321 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/121fb884-b4fb-4849-96fd-e84b3d5a0884-tls-certs\") pod \"kube-auth-proxy-7b9c9c888c-98w6z\" (UID: \"121fb884-b4fb-4849-96fd-e84b3d5a0884\") " pod="openshift-ingress/kube-auth-proxy-7b9c9c888c-98w6z" Apr 20 16:29:52.576376 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:52.576376 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csx4j\" (UniqueName: \"kubernetes.io/projected/121fb884-b4fb-4849-96fd-e84b3d5a0884-kube-api-access-csx4j\") pod \"kube-auth-proxy-7b9c9c888c-98w6z\" (UID: \"121fb884-b4fb-4849-96fd-e84b3d5a0884\") " pod="openshift-ingress/kube-auth-proxy-7b9c9c888c-98w6z" Apr 20 16:29:52.576583 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:52.576408 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/121fb884-b4fb-4849-96fd-e84b3d5a0884-tmp\") pod \"kube-auth-proxy-7b9c9c888c-98w6z\" (UID: \"121fb884-b4fb-4849-96fd-e84b3d5a0884\") " pod="openshift-ingress/kube-auth-proxy-7b9c9c888c-98w6z" Apr 20 16:29:52.578709 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:52.578685 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/121fb884-b4fb-4849-96fd-e84b3d5a0884-tmp\") pod \"kube-auth-proxy-7b9c9c888c-98w6z\" (UID: \"121fb884-b4fb-4849-96fd-e84b3d5a0884\") " pod="openshift-ingress/kube-auth-proxy-7b9c9c888c-98w6z" Apr 20 16:29:52.578962 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:52.578941 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/121fb884-b4fb-4849-96fd-e84b3d5a0884-tls-certs\") pod \"kube-auth-proxy-7b9c9c888c-98w6z\" (UID: \"121fb884-b4fb-4849-96fd-e84b3d5a0884\") " pod="openshift-ingress/kube-auth-proxy-7b9c9c888c-98w6z" Apr 20 16:29:52.588588 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:52.588557 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-csx4j\" (UniqueName: \"kubernetes.io/projected/121fb884-b4fb-4849-96fd-e84b3d5a0884-kube-api-access-csx4j\") pod \"kube-auth-proxy-7b9c9c888c-98w6z\" (UID: \"121fb884-b4fb-4849-96fd-e84b3d5a0884\") " pod="openshift-ingress/kube-auth-proxy-7b9c9c888c-98w6z" Apr 20 16:29:52.724966 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:52.724864 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7b9c9c888c-98w6z" Apr 20 16:29:52.846172 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:52.846083 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7b9c9c888c-98w6z"] Apr 20 16:29:52.848809 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:29:52.848777 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod121fb884_b4fb_4849_96fd_e84b3d5a0884.slice/crio-579fbe04479fdae87ef8bc3910c6723232974faeea5fab9629f9188f92a65686 WatchSource:0}: Error finding container 579fbe04479fdae87ef8bc3910c6723232974faeea5fab9629f9188f92a65686: Status 404 returned error can't find the container with id 579fbe04479fdae87ef8bc3910c6723232974faeea5fab9629f9188f92a65686 Apr 20 16:29:53.483490 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:53.483436 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7b9c9c888c-98w6z" event={"ID":"121fb884-b4fb-4849-96fd-e84b3d5a0884","Type":"ContainerStarted","Data":"579fbe04479fdae87ef8bc3910c6723232974faeea5fab9629f9188f92a65686"} Apr 20 16:29:55.380334 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:55.380298 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-zwzxb"] Apr 20 16:29:55.383312 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:55.383292 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-zwzxb" Apr 20 16:29:55.385826 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:55.385802 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 20 16:29:55.385958 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:55.385826 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-bbjxs\"" Apr 20 16:29:55.390573 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:55.390543 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-zwzxb"] Apr 20 16:29:55.498547 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:55.498513 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdfr8\" (UniqueName: \"kubernetes.io/projected/00b40740-eb5d-47b4-89d8-20cf339ccaad-kube-api-access-hdfr8\") pod \"odh-model-controller-858dbf95b8-zwzxb\" (UID: \"00b40740-eb5d-47b4-89d8-20cf339ccaad\") " pod="opendatahub/odh-model-controller-858dbf95b8-zwzxb" Apr 20 16:29:55.498730 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:55.498568 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/00b40740-eb5d-47b4-89d8-20cf339ccaad-cert\") pod \"odh-model-controller-858dbf95b8-zwzxb\" (UID: \"00b40740-eb5d-47b4-89d8-20cf339ccaad\") " pod="opendatahub/odh-model-controller-858dbf95b8-zwzxb" Apr 20 16:29:55.599726 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:55.599692 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hdfr8\" (UniqueName: \"kubernetes.io/projected/00b40740-eb5d-47b4-89d8-20cf339ccaad-kube-api-access-hdfr8\") pod \"odh-model-controller-858dbf95b8-zwzxb\" (UID: \"00b40740-eb5d-47b4-89d8-20cf339ccaad\") " 
pod="opendatahub/odh-model-controller-858dbf95b8-zwzxb" Apr 20 16:29:55.599925 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:55.599747 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/00b40740-eb5d-47b4-89d8-20cf339ccaad-cert\") pod \"odh-model-controller-858dbf95b8-zwzxb\" (UID: \"00b40740-eb5d-47b4-89d8-20cf339ccaad\") " pod="opendatahub/odh-model-controller-858dbf95b8-zwzxb" Apr 20 16:29:55.599925 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:29:55.599911 2576 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 20 16:29:55.600017 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:29:55.599965 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00b40740-eb5d-47b4-89d8-20cf339ccaad-cert podName:00b40740-eb5d-47b4-89d8-20cf339ccaad nodeName:}" failed. No retries permitted until 2026-04-20 16:29:56.099949545 +0000 UTC m=+404.326157465 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/00b40740-eb5d-47b4-89d8-20cf339ccaad-cert") pod "odh-model-controller-858dbf95b8-zwzxb" (UID: "00b40740-eb5d-47b4-89d8-20cf339ccaad") : secret "odh-model-controller-webhook-cert" not found Apr 20 16:29:55.610944 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:55.610897 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdfr8\" (UniqueName: \"kubernetes.io/projected/00b40740-eb5d-47b4-89d8-20cf339ccaad-kube-api-access-hdfr8\") pod \"odh-model-controller-858dbf95b8-zwzxb\" (UID: \"00b40740-eb5d-47b4-89d8-20cf339ccaad\") " pod="opendatahub/odh-model-controller-858dbf95b8-zwzxb" Apr 20 16:29:56.105390 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:56.105335 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/00b40740-eb5d-47b4-89d8-20cf339ccaad-cert\") pod \"odh-model-controller-858dbf95b8-zwzxb\" (UID: \"00b40740-eb5d-47b4-89d8-20cf339ccaad\") " pod="opendatahub/odh-model-controller-858dbf95b8-zwzxb" Apr 20 16:29:56.108184 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:56.108154 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/00b40740-eb5d-47b4-89d8-20cf339ccaad-cert\") pod \"odh-model-controller-858dbf95b8-zwzxb\" (UID: \"00b40740-eb5d-47b4-89d8-20cf339ccaad\") " pod="opendatahub/odh-model-controller-858dbf95b8-zwzxb" Apr 20 16:29:56.293586 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:56.293543 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-zwzxb" Apr 20 16:29:56.416129 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:56.416094 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-zwzxb"] Apr 20 16:29:56.419627 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:29:56.419597 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00b40740_eb5d_47b4_89d8_20cf339ccaad.slice/crio-87f993b3fef08f94982d05e57ef74b638a697afac5f295c632214ef80f4da355 WatchSource:0}: Error finding container 87f993b3fef08f94982d05e57ef74b638a697afac5f295c632214ef80f4da355: Status 404 returned error can't find the container with id 87f993b3fef08f94982d05e57ef74b638a697afac5f295c632214ef80f4da355 Apr 20 16:29:56.492856 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:56.492802 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7b9c9c888c-98w6z" event={"ID":"121fb884-b4fb-4849-96fd-e84b3d5a0884","Type":"ContainerStarted","Data":"a9b325ce7a1900f3bc1c751799655e17a4261d9a7a8df8a9342e7f70071c462d"} Apr 20 16:29:56.494351 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:56.494316 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-zwzxb" event={"ID":"00b40740-eb5d-47b4-89d8-20cf339ccaad","Type":"ContainerStarted","Data":"87f993b3fef08f94982d05e57ef74b638a697afac5f295c632214ef80f4da355"} Apr 20 16:29:56.508860 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:56.508804 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-7b9c9c888c-98w6z" podStartSLOduration=1.116113261 podStartE2EDuration="4.508787645s" podCreationTimestamp="2026-04-20 16:29:52 +0000 UTC" firstStartedPulling="2026-04-20 16:29:52.850513387 +0000 UTC m=+401.076721311" lastFinishedPulling="2026-04-20 16:29:56.243187775 +0000 UTC 
m=+404.469395695" observedRunningTime="2026-04-20 16:29:56.507625642 +0000 UTC m=+404.733833585" watchObservedRunningTime="2026-04-20 16:29:56.508787645 +0000 UTC m=+404.734995583" Apr 20 16:29:59.504572 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:59.504537 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-zwzxb" event={"ID":"00b40740-eb5d-47b4-89d8-20cf339ccaad","Type":"ContainerStarted","Data":"cfc356e6bb03d8359173b2db474c2743aa21b09f94ab6caf0268a8e3dd5205fe"} Apr 20 16:29:59.504948 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:59.504656 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-zwzxb" Apr 20 16:29:59.520806 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:29:59.520739 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-zwzxb" podStartSLOduration=1.5382945270000001 podStartE2EDuration="4.520722236s" podCreationTimestamp="2026-04-20 16:29:55 +0000 UTC" firstStartedPulling="2026-04-20 16:29:56.42091903 +0000 UTC m=+404.647126953" lastFinishedPulling="2026-04-20 16:29:59.403346737 +0000 UTC m=+407.629554662" observedRunningTime="2026-04-20 16:29:59.519953688 +0000 UTC m=+407.746161644" watchObservedRunningTime="2026-04-20 16:29:59.520722236 +0000 UTC m=+407.746930177" Apr 20 16:30:00.508747 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:00.508714 2576 generic.go:358] "Generic (PLEG): container finished" podID="00b40740-eb5d-47b4-89d8-20cf339ccaad" containerID="cfc356e6bb03d8359173b2db474c2743aa21b09f94ab6caf0268a8e3dd5205fe" exitCode=1 Apr 20 16:30:00.509117 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:00.508775 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-zwzxb" 
event={"ID":"00b40740-eb5d-47b4-89d8-20cf339ccaad","Type":"ContainerDied","Data":"cfc356e6bb03d8359173b2db474c2743aa21b09f94ab6caf0268a8e3dd5205fe"} Apr 20 16:30:00.509117 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:00.509004 2576 scope.go:117] "RemoveContainer" containerID="cfc356e6bb03d8359173b2db474c2743aa21b09f94ab6caf0268a8e3dd5205fe" Apr 20 16:30:00.749961 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:00.749931 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-r66jj"] Apr 20 16:30:00.753045 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:00.753022 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-r66jj" Apr 20 16:30:00.756523 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:00.756500 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 20 16:30:00.758297 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:00.758276 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-4wr4n\"" Apr 20 16:30:00.771231 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:00.771207 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-r66jj"] Apr 20 16:30:00.844720 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:00.844683 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8x65\" (UniqueName: \"kubernetes.io/projected/6a4e8404-a71d-445f-9f36-93fca6e28194-kube-api-access-h8x65\") pod \"kserve-controller-manager-856948b99f-r66jj\" (UID: \"6a4e8404-a71d-445f-9f36-93fca6e28194\") " pod="opendatahub/kserve-controller-manager-856948b99f-r66jj" Apr 20 16:30:00.844908 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:00.844752 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a4e8404-a71d-445f-9f36-93fca6e28194-cert\") pod \"kserve-controller-manager-856948b99f-r66jj\" (UID: \"6a4e8404-a71d-445f-9f36-93fca6e28194\") " pod="opendatahub/kserve-controller-manager-856948b99f-r66jj" Apr 20 16:30:00.946117 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:00.946074 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a4e8404-a71d-445f-9f36-93fca6e28194-cert\") pod \"kserve-controller-manager-856948b99f-r66jj\" (UID: \"6a4e8404-a71d-445f-9f36-93fca6e28194\") " pod="opendatahub/kserve-controller-manager-856948b99f-r66jj" Apr 20 16:30:00.946296 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:00.946173 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8x65\" (UniqueName: \"kubernetes.io/projected/6a4e8404-a71d-445f-9f36-93fca6e28194-kube-api-access-h8x65\") pod \"kserve-controller-manager-856948b99f-r66jj\" (UID: \"6a4e8404-a71d-445f-9f36-93fca6e28194\") " pod="opendatahub/kserve-controller-manager-856948b99f-r66jj" Apr 20 16:30:00.946296 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:30:00.946236 2576 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 20 16:30:00.946383 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:30:00.946303 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a4e8404-a71d-445f-9f36-93fca6e28194-cert podName:6a4e8404-a71d-445f-9f36-93fca6e28194 nodeName:}" failed. No retries permitted until 2026-04-20 16:30:01.446287106 +0000 UTC m=+409.672495026 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6a4e8404-a71d-445f-9f36-93fca6e28194-cert") pod "kserve-controller-manager-856948b99f-r66jj" (UID: "6a4e8404-a71d-445f-9f36-93fca6e28194") : secret "kserve-webhook-server-cert" not found Apr 20 16:30:00.956509 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:00.956475 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8x65\" (UniqueName: \"kubernetes.io/projected/6a4e8404-a71d-445f-9f36-93fca6e28194-kube-api-access-h8x65\") pod \"kserve-controller-manager-856948b99f-r66jj\" (UID: \"6a4e8404-a71d-445f-9f36-93fca6e28194\") " pod="opendatahub/kserve-controller-manager-856948b99f-r66jj" Apr 20 16:30:01.450156 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:01.450121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a4e8404-a71d-445f-9f36-93fca6e28194-cert\") pod \"kserve-controller-manager-856948b99f-r66jj\" (UID: \"6a4e8404-a71d-445f-9f36-93fca6e28194\") " pod="opendatahub/kserve-controller-manager-856948b99f-r66jj" Apr 20 16:30:01.450314 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:30:01.450247 2576 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 20 16:30:01.450314 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:30:01.450303 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a4e8404-a71d-445f-9f36-93fca6e28194-cert podName:6a4e8404-a71d-445f-9f36-93fca6e28194 nodeName:}" failed. No retries permitted until 2026-04-20 16:30:02.45028827 +0000 UTC m=+410.676496190 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6a4e8404-a71d-445f-9f36-93fca6e28194-cert") pod "kserve-controller-manager-856948b99f-r66jj" (UID: "6a4e8404-a71d-445f-9f36-93fca6e28194") : secret "kserve-webhook-server-cert" not found Apr 20 16:30:01.513213 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:01.513175 2576 generic.go:358] "Generic (PLEG): container finished" podID="00b40740-eb5d-47b4-89d8-20cf339ccaad" containerID="5d498190300ea05b662f4ccb80dcebd1e18102914f7a59bf1c9abceef768ceb4" exitCode=1 Apr 20 16:30:01.513639 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:01.513236 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-zwzxb" event={"ID":"00b40740-eb5d-47b4-89d8-20cf339ccaad","Type":"ContainerDied","Data":"5d498190300ea05b662f4ccb80dcebd1e18102914f7a59bf1c9abceef768ceb4"} Apr 20 16:30:01.513639 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:01.513273 2576 scope.go:117] "RemoveContainer" containerID="cfc356e6bb03d8359173b2db474c2743aa21b09f94ab6caf0268a8e3dd5205fe" Apr 20 16:30:01.513639 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:01.513469 2576 scope.go:117] "RemoveContainer" containerID="5d498190300ea05b662f4ccb80dcebd1e18102914f7a59bf1c9abceef768ceb4" Apr 20 16:30:01.513783 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:30:01.513651 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-zwzxb_opendatahub(00b40740-eb5d-47b4-89d8-20cf339ccaad)\"" pod="opendatahub/odh-model-controller-858dbf95b8-zwzxb" podUID="00b40740-eb5d-47b4-89d8-20cf339ccaad" Apr 20 16:30:02.457487 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:02.457450 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a4e8404-a71d-445f-9f36-93fca6e28194-cert\") pod 
\"kserve-controller-manager-856948b99f-r66jj\" (UID: \"6a4e8404-a71d-445f-9f36-93fca6e28194\") " pod="opendatahub/kserve-controller-manager-856948b99f-r66jj" Apr 20 16:30:02.459988 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:02.459961 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a4e8404-a71d-445f-9f36-93fca6e28194-cert\") pod \"kserve-controller-manager-856948b99f-r66jj\" (UID: \"6a4e8404-a71d-445f-9f36-93fca6e28194\") " pod="opendatahub/kserve-controller-manager-856948b99f-r66jj" Apr 20 16:30:02.517050 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:02.517015 2576 scope.go:117] "RemoveContainer" containerID="5d498190300ea05b662f4ccb80dcebd1e18102914f7a59bf1c9abceef768ceb4" Apr 20 16:30:02.517487 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:30:02.517233 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-zwzxb_opendatahub(00b40740-eb5d-47b4-89d8-20cf339ccaad)\"" pod="opendatahub/odh-model-controller-858dbf95b8-zwzxb" podUID="00b40740-eb5d-47b4-89d8-20cf339ccaad" Apr 20 16:30:02.562167 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:02.562127 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-r66jj" Apr 20 16:30:02.678854 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:02.678815 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-r66jj"] Apr 20 16:30:02.681831 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:30:02.681805 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a4e8404_a71d_445f_9f36_93fca6e28194.slice/crio-87b183a7f7b8b0c7fbb3f76778cfda1c596e901e5f4d037ff6c079543fc3064e WatchSource:0}: Error finding container 87b183a7f7b8b0c7fbb3f76778cfda1c596e901e5f4d037ff6c079543fc3064e: Status 404 returned error can't find the container with id 87b183a7f7b8b0c7fbb3f76778cfda1c596e901e5f4d037ff6c079543fc3064e Apr 20 16:30:03.520464 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:03.520428 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-r66jj" event={"ID":"6a4e8404-a71d-445f-9f36-93fca6e28194","Type":"ContainerStarted","Data":"87b183a7f7b8b0c7fbb3f76778cfda1c596e901e5f4d037ff6c079543fc3064e"} Apr 20 16:30:06.530991 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:06.530950 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-r66jj" event={"ID":"6a4e8404-a71d-445f-9f36-93fca6e28194","Type":"ContainerStarted","Data":"d8ef35c82d35b44fa525c73cac999f64a42d82025c584b3b9b3182c2681a4dd5"} Apr 20 16:30:06.531442 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:06.531166 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-r66jj" Apr 20 16:30:06.549922 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:06.549863 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-r66jj" 
podStartSLOduration=3.610627291 podStartE2EDuration="6.549850026s" podCreationTimestamp="2026-04-20 16:30:00 +0000 UTC" firstStartedPulling="2026-04-20 16:30:02.683209563 +0000 UTC m=+410.909417483" lastFinishedPulling="2026-04-20 16:30:05.622432299 +0000 UTC m=+413.848640218" observedRunningTime="2026-04-20 16:30:06.54864899 +0000 UTC m=+414.774856932" watchObservedRunningTime="2026-04-20 16:30:06.549850026 +0000 UTC m=+414.776057967" Apr 20 16:30:06.920683 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:06.920596 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-gnq9h"] Apr 20 16:30:06.924894 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:06.924872 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gnq9h" Apr 20 16:30:06.929088 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:06.929069 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 20 16:30:06.929192 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:06.929134 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 20 16:30:06.929192 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:06.929156 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-v25q6\"" Apr 20 16:30:06.946796 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:06.946751 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-gnq9h"] Apr 20 16:30:07.089499 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:07.089451 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4h5h\" (UniqueName: 
\"kubernetes.io/projected/836b8b63-5507-4886-9997-0a1287899dfc-kube-api-access-k4h5h\") pod \"servicemesh-operator3-55f49c5f94-gnq9h\" (UID: \"836b8b63-5507-4886-9997-0a1287899dfc\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gnq9h" Apr 20 16:30:07.089683 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:07.089528 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/836b8b63-5507-4886-9997-0a1287899dfc-operator-config\") pod \"servicemesh-operator3-55f49c5f94-gnq9h\" (UID: \"836b8b63-5507-4886-9997-0a1287899dfc\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gnq9h" Apr 20 16:30:07.190130 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:07.190031 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/836b8b63-5507-4886-9997-0a1287899dfc-operator-config\") pod \"servicemesh-operator3-55f49c5f94-gnq9h\" (UID: \"836b8b63-5507-4886-9997-0a1287899dfc\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gnq9h" Apr 20 16:30:07.190130 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:07.190103 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4h5h\" (UniqueName: \"kubernetes.io/projected/836b8b63-5507-4886-9997-0a1287899dfc-kube-api-access-k4h5h\") pod \"servicemesh-operator3-55f49c5f94-gnq9h\" (UID: \"836b8b63-5507-4886-9997-0a1287899dfc\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gnq9h" Apr 20 16:30:07.192600 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:07.192580 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/836b8b63-5507-4886-9997-0a1287899dfc-operator-config\") pod \"servicemesh-operator3-55f49c5f94-gnq9h\" (UID: \"836b8b63-5507-4886-9997-0a1287899dfc\") " 
pod="openshift-operators/servicemesh-operator3-55f49c5f94-gnq9h" Apr 20 16:30:07.198444 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:07.198415 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4h5h\" (UniqueName: \"kubernetes.io/projected/836b8b63-5507-4886-9997-0a1287899dfc-kube-api-access-k4h5h\") pod \"servicemesh-operator3-55f49c5f94-gnq9h\" (UID: \"836b8b63-5507-4886-9997-0a1287899dfc\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gnq9h" Apr 20 16:30:07.233716 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:07.233678 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gnq9h" Apr 20 16:30:07.357771 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:07.357708 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-gnq9h"] Apr 20 16:30:07.362215 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:30:07.362177 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod836b8b63_5507_4886_9997_0a1287899dfc.slice/crio-f8c073518e97784c6ab00d25cdc33ff225d9d4d965729d0ffeeff9a00e8520f0 WatchSource:0}: Error finding container f8c073518e97784c6ab00d25cdc33ff225d9d4d965729d0ffeeff9a00e8520f0: Status 404 returned error can't find the container with id f8c073518e97784c6ab00d25cdc33ff225d9d4d965729d0ffeeff9a00e8520f0 Apr 20 16:30:07.535139 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:07.535103 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gnq9h" event={"ID":"836b8b63-5507-4886-9997-0a1287899dfc","Type":"ContainerStarted","Data":"f8c073518e97784c6ab00d25cdc33ff225d9d4d965729d0ffeeff9a00e8520f0"} Apr 20 16:30:09.505014 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:09.504980 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="opendatahub/odh-model-controller-858dbf95b8-zwzxb" Apr 20 16:30:09.505438 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:09.505414 2576 scope.go:117] "RemoveContainer" containerID="5d498190300ea05b662f4ccb80dcebd1e18102914f7a59bf1c9abceef768ceb4" Apr 20 16:30:09.505643 ip-10-0-142-44 kubenswrapper[2576]: E0420 16:30:09.505622 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-zwzxb_opendatahub(00b40740-eb5d-47b4-89d8-20cf339ccaad)\"" pod="opendatahub/odh-model-controller-858dbf95b8-zwzxb" podUID="00b40740-eb5d-47b4-89d8-20cf339ccaad" Apr 20 16:30:10.550979 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:10.550938 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gnq9h" event={"ID":"836b8b63-5507-4886-9997-0a1287899dfc","Type":"ContainerStarted","Data":"c51945d5b70322974c22b794bf57b4b9e78f7e404ae09e87b58e681eeb773f49"} Apr 20 16:30:10.551447 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:10.551011 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gnq9h" Apr 20 16:30:10.571142 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:10.571079 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gnq9h" podStartSLOduration=1.691939965 podStartE2EDuration="4.571058894s" podCreationTimestamp="2026-04-20 16:30:06 +0000 UTC" firstStartedPulling="2026-04-20 16:30:07.364691102 +0000 UTC m=+415.590899024" lastFinishedPulling="2026-04-20 16:30:10.243810024 +0000 UTC m=+418.470017953" observedRunningTime="2026-04-20 16:30:10.569152049 +0000 UTC m=+418.795359990" watchObservedRunningTime="2026-04-20 16:30:10.571058894 +0000 UTC m=+418.797266836" Apr 20 16:30:16.294178 ip-10-0-142-44 
kubenswrapper[2576]: I0420 16:30:16.294139 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-zwzxb" Apr 20 16:30:16.294569 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:16.294550 2576 scope.go:117] "RemoveContainer" containerID="5d498190300ea05b662f4ccb80dcebd1e18102914f7a59bf1c9abceef768ceb4" Apr 20 16:30:17.574906 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:17.574873 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-zwzxb" event={"ID":"00b40740-eb5d-47b4-89d8-20cf339ccaad","Type":"ContainerStarted","Data":"4eb52f1b20d9e786aceefbbda55d0ecab4a7456f49f80cd1bc7e735b4a1a55b2"} Apr 20 16:30:17.575302 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:17.575094 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-zwzxb" Apr 20 16:30:21.556450 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:21.556421 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gnq9h" Apr 20 16:30:28.580795 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:28.580746 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-zwzxb" Apr 20 16:30:37.405157 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.405075 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl"] Apr 20 16:30:37.409334 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.409314 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" Apr 20 16:30:37.412326 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.412128 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 20 16:30:37.412326 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.412207 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 20 16:30:37.412326 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.412218 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 20 16:30:37.412326 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.412284 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-n5jwg\"" Apr 20 16:30:37.412326 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.412302 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 20 16:30:37.418166 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.418139 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl"] Apr 20 16:30:37.513020 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.512979 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/d55126c5-0a68-45f4-a6de-2a589a931055-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-sd6vl\" (UID: \"d55126c5-0a68-45f4-a6de-2a589a931055\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" Apr 20 16:30:37.513020 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.513025 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: 
\"kubernetes.io/empty-dir/d55126c5-0a68-45f4-a6de-2a589a931055-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-sd6vl\" (UID: \"d55126c5-0a68-45f4-a6de-2a589a931055\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" Apr 20 16:30:37.513270 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.513052 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d55126c5-0a68-45f4-a6de-2a589a931055-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-sd6vl\" (UID: \"d55126c5-0a68-45f4-a6de-2a589a931055\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" Apr 20 16:30:37.513270 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.513111 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/d55126c5-0a68-45f4-a6de-2a589a931055-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-sd6vl\" (UID: \"d55126c5-0a68-45f4-a6de-2a589a931055\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" Apr 20 16:30:37.513270 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.513181 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d55126c5-0a68-45f4-a6de-2a589a931055-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-sd6vl\" (UID: \"d55126c5-0a68-45f4-a6de-2a589a931055\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" Apr 20 16:30:37.513270 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.513220 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/d55126c5-0a68-45f4-a6de-2a589a931055-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-sd6vl\" (UID: 
\"d55126c5-0a68-45f4-a6de-2a589a931055\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" Apr 20 16:30:37.513270 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.513262 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jld55\" (UniqueName: \"kubernetes.io/projected/d55126c5-0a68-45f4-a6de-2a589a931055-kube-api-access-jld55\") pod \"istiod-openshift-gateway-55ff986f96-sd6vl\" (UID: \"d55126c5-0a68-45f4-a6de-2a589a931055\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" Apr 20 16:30:37.540964 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.540935 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-r66jj" Apr 20 16:30:37.613942 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.613898 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/d55126c5-0a68-45f4-a6de-2a589a931055-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-sd6vl\" (UID: \"d55126c5-0a68-45f4-a6de-2a589a931055\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" Apr 20 16:30:37.614152 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.613956 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/d55126c5-0a68-45f4-a6de-2a589a931055-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-sd6vl\" (UID: \"d55126c5-0a68-45f4-a6de-2a589a931055\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" Apr 20 16:30:37.614152 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.613982 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d55126c5-0a68-45f4-a6de-2a589a931055-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-sd6vl\" (UID: 
\"d55126c5-0a68-45f4-a6de-2a589a931055\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" Apr 20 16:30:37.614152 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.614027 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/d55126c5-0a68-45f4-a6de-2a589a931055-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-sd6vl\" (UID: \"d55126c5-0a68-45f4-a6de-2a589a931055\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" Apr 20 16:30:37.614152 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.614055 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d55126c5-0a68-45f4-a6de-2a589a931055-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-sd6vl\" (UID: \"d55126c5-0a68-45f4-a6de-2a589a931055\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" Apr 20 16:30:37.614437 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.614406 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/d55126c5-0a68-45f4-a6de-2a589a931055-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-sd6vl\" (UID: \"d55126c5-0a68-45f4-a6de-2a589a931055\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" Apr 20 16:30:37.614572 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.614460 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jld55\" (UniqueName: \"kubernetes.io/projected/d55126c5-0a68-45f4-a6de-2a589a931055-kube-api-access-jld55\") pod \"istiod-openshift-gateway-55ff986f96-sd6vl\" (UID: \"d55126c5-0a68-45f4-a6de-2a589a931055\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" Apr 20 16:30:37.615108 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.615082 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/d55126c5-0a68-45f4-a6de-2a589a931055-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-sd6vl\" (UID: \"d55126c5-0a68-45f4-a6de-2a589a931055\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" Apr 20 16:30:37.617035 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.616976 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/d55126c5-0a68-45f4-a6de-2a589a931055-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-sd6vl\" (UID: \"d55126c5-0a68-45f4-a6de-2a589a931055\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" Apr 20 16:30:37.617035 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.617016 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/d55126c5-0a68-45f4-a6de-2a589a931055-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-sd6vl\" (UID: \"d55126c5-0a68-45f4-a6de-2a589a931055\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" Apr 20 16:30:37.617284 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.617059 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d55126c5-0a68-45f4-a6de-2a589a931055-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-sd6vl\" (UID: \"d55126c5-0a68-45f4-a6de-2a589a931055\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" Apr 20 16:30:37.618436 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.618023 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/d55126c5-0a68-45f4-a6de-2a589a931055-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-sd6vl\" (UID: 
\"d55126c5-0a68-45f4-a6de-2a589a931055\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" Apr 20 16:30:37.623812 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.623733 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d55126c5-0a68-45f4-a6de-2a589a931055-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-sd6vl\" (UID: \"d55126c5-0a68-45f4-a6de-2a589a931055\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" Apr 20 16:30:37.624024 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.623964 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jld55\" (UniqueName: \"kubernetes.io/projected/d55126c5-0a68-45f4-a6de-2a589a931055-kube-api-access-jld55\") pod \"istiod-openshift-gateway-55ff986f96-sd6vl\" (UID: \"d55126c5-0a68-45f4-a6de-2a589a931055\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" Apr 20 16:30:37.719546 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.719441 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" Apr 20 16:30:37.849276 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:37.849248 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl"] Apr 20 16:30:37.852144 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:30:37.852100 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd55126c5_0a68_45f4_a6de_2a589a931055.slice/crio-70693ed17b4eb966fce1cbab08ee2c4609e5fe2eec2b504e46a31aa0c42c78e0 WatchSource:0}: Error finding container 70693ed17b4eb966fce1cbab08ee2c4609e5fe2eec2b504e46a31aa0c42c78e0: Status 404 returned error can't find the container with id 70693ed17b4eb966fce1cbab08ee2c4609e5fe2eec2b504e46a31aa0c42c78e0 Apr 20 16:30:38.640749 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:38.640715 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" event={"ID":"d55126c5-0a68-45f4-a6de-2a589a931055","Type":"ContainerStarted","Data":"70693ed17b4eb966fce1cbab08ee2c4609e5fe2eec2b504e46a31aa0c42c78e0"} Apr 20 16:30:40.671798 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:40.671739 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 20 16:30:40.672071 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:40.671834 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 20 16:30:41.653127 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:41.653086 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" 
event={"ID":"d55126c5-0a68-45f4-a6de-2a589a931055","Type":"ContainerStarted","Data":"518fcc31c5737289ff0f807530c80592d242fc794470a68469cad5ebcaa243b1"} Apr 20 16:30:41.653325 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:41.653305 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" Apr 20 16:30:41.654885 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:41.654857 2576 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-sd6vl container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 20 16:30:41.655001 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:41.654905 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" podUID="d55126c5-0a68-45f4-a6de-2a589a931055" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 16:30:41.671139 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:41.671081 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" podStartSLOduration=1.853714717 podStartE2EDuration="4.671066006s" podCreationTimestamp="2026-04-20 16:30:37 +0000 UTC" firstStartedPulling="2026-04-20 16:30:37.854165523 +0000 UTC m=+446.080373447" lastFinishedPulling="2026-04-20 16:30:40.671516812 +0000 UTC m=+448.897724736" observedRunningTime="2026-04-20 16:30:41.670836384 +0000 UTC m=+449.897044328" watchObservedRunningTime="2026-04-20 16:30:41.671066006 +0000 UTC m=+449.897273948" Apr 20 16:30:42.657595 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:30:42.657567 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-sd6vl" Apr 20 16:31:33.043468 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:33.043430 2576 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-dmzwm"] Apr 20 16:31:33.046323 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:33.046303 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-dmzwm" Apr 20 16:31:33.051796 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:33.051772 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 16:31:33.051908 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:33.051798 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-kx66k\"" Apr 20 16:31:33.051972 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:33.051952 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 16:31:33.066333 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:33.066306 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-dmzwm"] Apr 20 16:31:33.157362 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:33.157323 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx4d6\" (UniqueName: \"kubernetes.io/projected/03d385b5-2557-4579-b802-db01cd525052-kube-api-access-mx4d6\") pod \"authorino-operator-657f44b778-dmzwm\" (UID: \"03d385b5-2557-4579-b802-db01cd525052\") " pod="kuadrant-system/authorino-operator-657f44b778-dmzwm" Apr 20 16:31:33.257797 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:33.257735 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mx4d6\" (UniqueName: \"kubernetes.io/projected/03d385b5-2557-4579-b802-db01cd525052-kube-api-access-mx4d6\") pod \"authorino-operator-657f44b778-dmzwm\" (UID: \"03d385b5-2557-4579-b802-db01cd525052\") " 
pod="kuadrant-system/authorino-operator-657f44b778-dmzwm" Apr 20 16:31:33.269328 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:33.269300 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx4d6\" (UniqueName: \"kubernetes.io/projected/03d385b5-2557-4579-b802-db01cd525052-kube-api-access-mx4d6\") pod \"authorino-operator-657f44b778-dmzwm\" (UID: \"03d385b5-2557-4579-b802-db01cd525052\") " pod="kuadrant-system/authorino-operator-657f44b778-dmzwm" Apr 20 16:31:33.357652 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:33.357566 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-dmzwm" Apr 20 16:31:33.503470 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:33.503428 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-dmzwm"] Apr 20 16:31:33.506600 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:31:33.506567 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03d385b5_2557_4579_b802_db01cd525052.slice/crio-00e6de1d0a879786ea6d9650d336251ef88548d9378326f3e3df8bf6cca081bd WatchSource:0}: Error finding container 00e6de1d0a879786ea6d9650d336251ef88548d9378326f3e3df8bf6cca081bd: Status 404 returned error can't find the container with id 00e6de1d0a879786ea6d9650d336251ef88548d9378326f3e3df8bf6cca081bd Apr 20 16:31:33.817005 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:33.816972 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-dmzwm" event={"ID":"03d385b5-2557-4579-b802-db01cd525052","Type":"ContainerStarted","Data":"00e6de1d0a879786ea6d9650d336251ef88548d9378326f3e3df8bf6cca081bd"} Apr 20 16:31:35.824503 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:35.824461 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-dmzwm" 
event={"ID":"03d385b5-2557-4579-b802-db01cd525052","Type":"ContainerStarted","Data":"f48d64668d9cb8cf3370f7bc6e38238f53501b16ab252475e32e0a63f476e7ef"} Apr 20 16:31:35.824915 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:35.824588 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-dmzwm" Apr 20 16:31:35.883322 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:35.883266 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-dmzwm" podStartSLOduration=1.165152057 podStartE2EDuration="2.88325088s" podCreationTimestamp="2026-04-20 16:31:33 +0000 UTC" firstStartedPulling="2026-04-20 16:31:33.508558826 +0000 UTC m=+501.734766747" lastFinishedPulling="2026-04-20 16:31:35.226657647 +0000 UTC m=+503.452865570" observedRunningTime="2026-04-20 16:31:35.881297704 +0000 UTC m=+504.107505656" watchObservedRunningTime="2026-04-20 16:31:35.88325088 +0000 UTC m=+504.109458879" Apr 20 16:31:46.829978 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:46.829946 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-dmzwm" Apr 20 16:31:49.012190 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:49.012153 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qbddb"] Apr 20 16:31:49.019011 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:49.018969 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qbddb" Apr 20 16:31:49.022008 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:49.021977 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-lk4cg\"" Apr 20 16:31:49.028731 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:49.028705 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qbddb"] Apr 20 16:31:49.076035 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:49.076001 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnndp\" (UniqueName: \"kubernetes.io/projected/2c4151d0-0b7f-4610-9a15-98648221b665-kube-api-access-tnndp\") pod \"kuadrant-operator-controller-manager-84b657d985-qbddb\" (UID: \"2c4151d0-0b7f-4610-9a15-98648221b665\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qbddb" Apr 20 16:31:49.076197 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:49.076043 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2c4151d0-0b7f-4610-9a15-98648221b665-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-qbddb\" (UID: \"2c4151d0-0b7f-4610-9a15-98648221b665\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qbddb" Apr 20 16:31:49.177370 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:49.177327 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnndp\" (UniqueName: \"kubernetes.io/projected/2c4151d0-0b7f-4610-9a15-98648221b665-kube-api-access-tnndp\") pod \"kuadrant-operator-controller-manager-84b657d985-qbddb\" (UID: \"2c4151d0-0b7f-4610-9a15-98648221b665\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qbddb" Apr 20 16:31:49.177370 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:49.177371 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2c4151d0-0b7f-4610-9a15-98648221b665-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-qbddb\" (UID: \"2c4151d0-0b7f-4610-9a15-98648221b665\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qbddb" Apr 20 16:31:49.177724 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:49.177706 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2c4151d0-0b7f-4610-9a15-98648221b665-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-qbddb\" (UID: \"2c4151d0-0b7f-4610-9a15-98648221b665\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qbddb" Apr 20 16:31:49.189526 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:49.189487 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnndp\" (UniqueName: \"kubernetes.io/projected/2c4151d0-0b7f-4610-9a15-98648221b665-kube-api-access-tnndp\") pod \"kuadrant-operator-controller-manager-84b657d985-qbddb\" (UID: \"2c4151d0-0b7f-4610-9a15-98648221b665\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qbddb" Apr 20 16:31:49.289077 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:49.288989 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qbddb"] Apr 20 16:31:49.289239 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:49.289226 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qbddb" Apr 20 16:31:49.304953 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:49.304389 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qbddb"] Apr 20 16:31:49.310625 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:49.310591 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-vf4ms"] Apr 20 16:31:49.316836 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:49.316809 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-vf4ms" Apr 20 16:31:49.332808 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:49.332744 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-vf4ms"] Apr 20 16:31:49.379020 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:49.378987 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh9lw\" (UniqueName: \"kubernetes.io/projected/dce2afe9-276c-417a-a055-17936aa323a7-kube-api-access-lh9lw\") pod \"kuadrant-operator-controller-manager-84b657d985-vf4ms\" (UID: \"dce2afe9-276c-417a-a055-17936aa323a7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-vf4ms" Apr 20 16:31:49.379184 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:49.379042 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/dce2afe9-276c-417a-a055-17936aa323a7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-vf4ms\" (UID: \"dce2afe9-276c-417a-a055-17936aa323a7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-vf4ms" Apr 20 16:31:49.479556 
ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:49.479511 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lh9lw\" (UniqueName: \"kubernetes.io/projected/dce2afe9-276c-417a-a055-17936aa323a7-kube-api-access-lh9lw\") pod \"kuadrant-operator-controller-manager-84b657d985-vf4ms\" (UID: \"dce2afe9-276c-417a-a055-17936aa323a7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-vf4ms"
Apr 20 16:31:49.479702 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:49.479621 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/dce2afe9-276c-417a-a055-17936aa323a7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-vf4ms\" (UID: \"dce2afe9-276c-417a-a055-17936aa323a7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-vf4ms"
Apr 20 16:31:49.480042 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:49.480022 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/dce2afe9-276c-417a-a055-17936aa323a7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-vf4ms\" (UID: \"dce2afe9-276c-417a-a055-17936aa323a7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-vf4ms"
Apr 20 16:31:49.488131 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:49.488099 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh9lw\" (UniqueName: \"kubernetes.io/projected/dce2afe9-276c-417a-a055-17936aa323a7-kube-api-access-lh9lw\") pod \"kuadrant-operator-controller-manager-84b657d985-vf4ms\" (UID: \"dce2afe9-276c-417a-a055-17936aa323a7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-vf4ms"
Apr 20 16:31:49.627576 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:49.627463 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-vf4ms"
Apr 20 16:31:49.750039 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:49.750007 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-vf4ms"]
Apr 20 16:31:49.752344 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:31:49.752310 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddce2afe9_276c_417a_a055_17936aa323a7.slice/crio-a8e0476a39e394636883f6dc108f8cd07932f00dcda6b971429f6e572add411a WatchSource:0}: Error finding container a8e0476a39e394636883f6dc108f8cd07932f00dcda6b971429f6e572add411a: Status 404 returned error can't find the container with id a8e0476a39e394636883f6dc108f8cd07932f00dcda6b971429f6e572add411a
Apr 20 16:31:49.867438 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:49.867399 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-vf4ms" event={"ID":"dce2afe9-276c-417a-a055-17936aa323a7","Type":"ContainerStarted","Data":"a8e0476a39e394636883f6dc108f8cd07932f00dcda6b971429f6e572add411a"}
Apr 20 16:31:51.923150 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:31:51.923113 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c4151d0_0b7f_4610_9a15_98648221b665.slice/crio-24e6ed755a5a53d1c6e882763b936946872f541aa4d26eb2a4ac691d66bc302e WatchSource:0}: Error finding container 24e6ed755a5a53d1c6e882763b936946872f541aa4d26eb2a4ac691d66bc302e: Status 404 returned error can't find the container with id 24e6ed755a5a53d1c6e882763b936946872f541aa4d26eb2a4ac691d66bc302e
Apr 20 16:31:54.889218 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:54.889118 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-vf4ms" event={"ID":"dce2afe9-276c-417a-a055-17936aa323a7","Type":"ContainerStarted","Data":"f79fd6be195dea80f592540b247eca4966c8c1babb72edda43c038d3801e855e"}
Apr 20 16:31:54.889218 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:54.889186 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-vf4ms"
Apr 20 16:31:54.890575 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:54.890546 2576 generic.go:358] "Generic (PLEG): container finished" podID="2c4151d0-0b7f-4610-9a15-98648221b665" containerID="13c0ed8b30d2c787b6c742f70d8d38faac0692cd208491317c7f856ac5ba21d5" exitCode=1
Apr 20 16:31:54.909894 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:54.909847 2576 status_manager.go:895] "Failed to get status for pod" podUID="2c4151d0-0b7f-4610-9a15-98648221b665" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qbddb" err="pods \"kuadrant-operator-controller-manager-84b657d985-qbddb\" is forbidden: User \"system:node:ip-10-0-142-44.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-44.ec2.internal' and this object"
Apr 20 16:31:54.910039 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:54.909990 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-vf4ms" podStartSLOduration=1.376529321 podStartE2EDuration="5.909974365s" podCreationTimestamp="2026-04-20 16:31:49 +0000 UTC" firstStartedPulling="2026-04-20 16:31:49.754825637 +0000 UTC m=+517.981033557" lastFinishedPulling="2026-04-20 16:31:54.288270678 +0000 UTC m=+522.514478601" observedRunningTime="2026-04-20 16:31:54.907438973 +0000 UTC m=+523.133646915" watchObservedRunningTime="2026-04-20 16:31:54.909974365 +0000 UTC m=+523.136182308"
Apr 20 16:31:54.920512 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:54.920490 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qbddb"
Apr 20 16:31:54.923015 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:54.922988 2576 status_manager.go:895] "Failed to get status for pod" podUID="2c4151d0-0b7f-4610-9a15-98648221b665" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qbddb" err="pods \"kuadrant-operator-controller-manager-84b657d985-qbddb\" is forbidden: User \"system:node:ip-10-0-142-44.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-44.ec2.internal' and this object"
Apr 20 16:31:55.018710 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:55.018666 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2c4151d0-0b7f-4610-9a15-98648221b665-extensions-socket-volume\") pod \"2c4151d0-0b7f-4610-9a15-98648221b665\" (UID: \"2c4151d0-0b7f-4610-9a15-98648221b665\") "
Apr 20 16:31:55.018710 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:55.018715 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnndp\" (UniqueName: \"kubernetes.io/projected/2c4151d0-0b7f-4610-9a15-98648221b665-kube-api-access-tnndp\") pod \"2c4151d0-0b7f-4610-9a15-98648221b665\" (UID: \"2c4151d0-0b7f-4610-9a15-98648221b665\") "
Apr 20 16:31:55.018962 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:55.018939 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c4151d0-0b7f-4610-9a15-98648221b665-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "2c4151d0-0b7f-4610-9a15-98648221b665" (UID: "2c4151d0-0b7f-4610-9a15-98648221b665"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 16:31:55.020887 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:55.020864 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c4151d0-0b7f-4610-9a15-98648221b665-kube-api-access-tnndp" (OuterVolumeSpecName: "kube-api-access-tnndp") pod "2c4151d0-0b7f-4610-9a15-98648221b665" (UID: "2c4151d0-0b7f-4610-9a15-98648221b665"). InnerVolumeSpecName "kube-api-access-tnndp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 16:31:55.119412 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:55.119374 2576 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2c4151d0-0b7f-4610-9a15-98648221b665-extensions-socket-volume\") on node \"ip-10-0-142-44.ec2.internal\" DevicePath \"\""
Apr 20 16:31:55.119412 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:55.119406 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tnndp\" (UniqueName: \"kubernetes.io/projected/2c4151d0-0b7f-4610-9a15-98648221b665-kube-api-access-tnndp\") on node \"ip-10-0-142-44.ec2.internal\" DevicePath \"\""
Apr 20 16:31:55.895193 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:55.895158 2576 scope.go:117] "RemoveContainer" containerID="13c0ed8b30d2c787b6c742f70d8d38faac0692cd208491317c7f856ac5ba21d5"
Apr 20 16:31:55.895193 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:55.895178 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qbddb"
Apr 20 16:31:55.897590 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:55.897561 2576 status_manager.go:895] "Failed to get status for pod" podUID="2c4151d0-0b7f-4610-9a15-98648221b665" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qbddb" err="pods \"kuadrant-operator-controller-manager-84b657d985-qbddb\" is forbidden: User \"system:node:ip-10-0-142-44.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-44.ec2.internal' and this object"
Apr 20 16:31:55.908783 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:55.908737 2576 status_manager.go:895] "Failed to get status for pod" podUID="2c4151d0-0b7f-4610-9a15-98648221b665" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-qbddb" err="pods \"kuadrant-operator-controller-manager-84b657d985-qbddb\" is forbidden: User \"system:node:ip-10-0-142-44.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-44.ec2.internal' and this object"
Apr 20 16:31:56.363115 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:31:56.363080 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c4151d0-0b7f-4610-9a15-98648221b665" path="/var/lib/kubelet/pods/2c4151d0-0b7f-4610-9a15-98648221b665/volumes"
Apr 20 16:32:05.898155 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:05.898073 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-vf4ms"
Apr 20 16:32:18.725052 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:18.725014 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-vf4ms"]
Apr 20 16:32:18.725557 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:18.725285 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-vf4ms" podUID="dce2afe9-276c-417a-a055-17936aa323a7" containerName="manager" containerID="cri-o://f79fd6be195dea80f592540b247eca4966c8c1babb72edda43c038d3801e855e" gracePeriod=10
Apr 20 16:32:18.969012 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:18.968980 2576 generic.go:358] "Generic (PLEG): container finished" podID="dce2afe9-276c-417a-a055-17936aa323a7" containerID="f79fd6be195dea80f592540b247eca4966c8c1babb72edda43c038d3801e855e" exitCode=0
Apr 20 16:32:18.969175 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:18.969063 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-vf4ms" event={"ID":"dce2afe9-276c-417a-a055-17936aa323a7","Type":"ContainerDied","Data":"f79fd6be195dea80f592540b247eca4966c8c1babb72edda43c038d3801e855e"}
Apr 20 16:32:18.969175 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:18.969102 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-vf4ms" event={"ID":"dce2afe9-276c-417a-a055-17936aa323a7","Type":"ContainerDied","Data":"a8e0476a39e394636883f6dc108f8cd07932f00dcda6b971429f6e572add411a"}
Apr 20 16:32:18.969175 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:18.969112 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8e0476a39e394636883f6dc108f8cd07932f00dcda6b971429f6e572add411a"
Apr 20 16:32:18.973708 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:18.973684 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-vf4ms"
Apr 20 16:32:18.993830 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:18.993730 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/dce2afe9-276c-417a-a055-17936aa323a7-extensions-socket-volume\") pod \"dce2afe9-276c-417a-a055-17936aa323a7\" (UID: \"dce2afe9-276c-417a-a055-17936aa323a7\") "
Apr 20 16:32:18.993830 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:18.993814 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lh9lw\" (UniqueName: \"kubernetes.io/projected/dce2afe9-276c-417a-a055-17936aa323a7-kube-api-access-lh9lw\") pod \"dce2afe9-276c-417a-a055-17936aa323a7\" (UID: \"dce2afe9-276c-417a-a055-17936aa323a7\") "
Apr 20 16:32:18.994247 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:18.994187 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dce2afe9-276c-417a-a055-17936aa323a7-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "dce2afe9-276c-417a-a055-17936aa323a7" (UID: "dce2afe9-276c-417a-a055-17936aa323a7"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 16:32:18.996334 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:18.996299 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dce2afe9-276c-417a-a055-17936aa323a7-kube-api-access-lh9lw" (OuterVolumeSpecName: "kube-api-access-lh9lw") pod "dce2afe9-276c-417a-a055-17936aa323a7" (UID: "dce2afe9-276c-417a-a055-17936aa323a7"). InnerVolumeSpecName "kube-api-access-lh9lw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 16:32:19.094645 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:19.094602 2576 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/dce2afe9-276c-417a-a055-17936aa323a7-extensions-socket-volume\") on node \"ip-10-0-142-44.ec2.internal\" DevicePath \"\""
Apr 20 16:32:19.094645 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:19.094638 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lh9lw\" (UniqueName: \"kubernetes.io/projected/dce2afe9-276c-417a-a055-17936aa323a7-kube-api-access-lh9lw\") on node \"ip-10-0-142-44.ec2.internal\" DevicePath \"\""
Apr 20 16:32:19.971835 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:19.971796 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-vf4ms"
Apr 20 16:32:20.001115 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:20.001077 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-vf4ms"]
Apr 20 16:32:20.006823 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:20.006790 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-vf4ms"]
Apr 20 16:32:20.362652 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:20.362610 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dce2afe9-276c-417a-a055-17936aa323a7" path="/var/lib/kubelet/pods/dce2afe9-276c-417a-a055-17936aa323a7/volumes"
Apr 20 16:32:22.596335 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.596267 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"]
Apr 20 16:32:22.596705 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.596619 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dce2afe9-276c-417a-a055-17936aa323a7" containerName="manager"
Apr 20 16:32:22.596705 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.596632 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="dce2afe9-276c-417a-a055-17936aa323a7" containerName="manager"
Apr 20 16:32:22.596705 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.596642 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c4151d0-0b7f-4610-9a15-98648221b665" containerName="manager"
Apr 20 16:32:22.596705 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.596648 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4151d0-0b7f-4610-9a15-98648221b665" containerName="manager"
Apr 20 16:32:22.596705 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.596701 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="dce2afe9-276c-417a-a055-17936aa323a7" containerName="manager"
Apr 20 16:32:22.596960 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.596710 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c4151d0-0b7f-4610-9a15-98648221b665" containerName="manager"
Apr 20 16:32:22.600878 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.600846 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:22.603662 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.603631 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-wjp5v\""
Apr 20 16:32:22.611791 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.611742 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"]
Apr 20 16:32:22.723619 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.723585 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c9d716d0-989b-40b4-9996-e83ea38ff758-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-nrghg\" (UID: \"c9d716d0-989b-40b4-9996-e83ea38ff758\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:22.723836 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.723646 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c9d716d0-989b-40b4-9996-e83ea38ff758-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-nrghg\" (UID: \"c9d716d0-989b-40b4-9996-e83ea38ff758\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:22.723836 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.723671 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-728d6\" (UniqueName: \"kubernetes.io/projected/c9d716d0-989b-40b4-9996-e83ea38ff758-kube-api-access-728d6\") pod \"maas-default-gateway-openshift-default-58b6f876-nrghg\" (UID: \"c9d716d0-989b-40b4-9996-e83ea38ff758\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:22.723836 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.723722 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c9d716d0-989b-40b4-9996-e83ea38ff758-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-nrghg\" (UID: \"c9d716d0-989b-40b4-9996-e83ea38ff758\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:22.723836 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.723790 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c9d716d0-989b-40b4-9996-e83ea38ff758-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-nrghg\" (UID: \"c9d716d0-989b-40b4-9996-e83ea38ff758\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:22.724052 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.723854 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c9d716d0-989b-40b4-9996-e83ea38ff758-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-nrghg\" (UID: \"c9d716d0-989b-40b4-9996-e83ea38ff758\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:22.724052 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.723881 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c9d716d0-989b-40b4-9996-e83ea38ff758-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-nrghg\" (UID: \"c9d716d0-989b-40b4-9996-e83ea38ff758\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:22.724052 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.723931 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/c9d716d0-989b-40b4-9996-e83ea38ff758-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-nrghg\" (UID: \"c9d716d0-989b-40b4-9996-e83ea38ff758\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:22.724052 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.723960 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c9d716d0-989b-40b4-9996-e83ea38ff758-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-nrghg\" (UID: \"c9d716d0-989b-40b4-9996-e83ea38ff758\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:22.824605 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.824563 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/c9d716d0-989b-40b4-9996-e83ea38ff758-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-nrghg\" (UID: \"c9d716d0-989b-40b4-9996-e83ea38ff758\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:22.824605 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.824603 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c9d716d0-989b-40b4-9996-e83ea38ff758-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-nrghg\" (UID: \"c9d716d0-989b-40b4-9996-e83ea38ff758\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:22.824891 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.824628 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c9d716d0-989b-40b4-9996-e83ea38ff758-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-nrghg\" (UID: \"c9d716d0-989b-40b4-9996-e83ea38ff758\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:22.824891 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.824650 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c9d716d0-989b-40b4-9996-e83ea38ff758-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-nrghg\" (UID: \"c9d716d0-989b-40b4-9996-e83ea38ff758\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:22.824891 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.824666 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-728d6\" (UniqueName: \"kubernetes.io/projected/c9d716d0-989b-40b4-9996-e83ea38ff758-kube-api-access-728d6\") pod \"maas-default-gateway-openshift-default-58b6f876-nrghg\" (UID: \"c9d716d0-989b-40b4-9996-e83ea38ff758\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:22.824891 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.824701 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c9d716d0-989b-40b4-9996-e83ea38ff758-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-nrghg\" (UID: \"c9d716d0-989b-40b4-9996-e83ea38ff758\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:22.824891 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.824798 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c9d716d0-989b-40b4-9996-e83ea38ff758-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-nrghg\" (UID: \"c9d716d0-989b-40b4-9996-e83ea38ff758\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:22.824891 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.824880 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c9d716d0-989b-40b4-9996-e83ea38ff758-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-nrghg\" (UID: \"c9d716d0-989b-40b4-9996-e83ea38ff758\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:22.825177 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.824913 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c9d716d0-989b-40b4-9996-e83ea38ff758-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-nrghg\" (UID: \"c9d716d0-989b-40b4-9996-e83ea38ff758\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:22.825177 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.825042 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/c9d716d0-989b-40b4-9996-e83ea38ff758-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-nrghg\" (UID: \"c9d716d0-989b-40b4-9996-e83ea38ff758\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:22.825177 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.825101 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c9d716d0-989b-40b4-9996-e83ea38ff758-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-nrghg\" (UID: \"c9d716d0-989b-40b4-9996-e83ea38ff758\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:22.825378 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.825357 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c9d716d0-989b-40b4-9996-e83ea38ff758-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-nrghg\" (UID: \"c9d716d0-989b-40b4-9996-e83ea38ff758\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:22.825601 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.825576 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c9d716d0-989b-40b4-9996-e83ea38ff758-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-nrghg\" (UID: \"c9d716d0-989b-40b4-9996-e83ea38ff758\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:22.825674 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.825590 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c9d716d0-989b-40b4-9996-e83ea38ff758-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-nrghg\" (UID: \"c9d716d0-989b-40b4-9996-e83ea38ff758\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:22.827158 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.827134 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c9d716d0-989b-40b4-9996-e83ea38ff758-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-nrghg\" (UID: \"c9d716d0-989b-40b4-9996-e83ea38ff758\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:22.827283 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.827268 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c9d716d0-989b-40b4-9996-e83ea38ff758-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-nrghg\" (UID: \"c9d716d0-989b-40b4-9996-e83ea38ff758\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:22.833463 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.833433 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c9d716d0-989b-40b4-9996-e83ea38ff758-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-nrghg\" (UID: \"c9d716d0-989b-40b4-9996-e83ea38ff758\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:22.833584 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.833566 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-728d6\" (UniqueName: \"kubernetes.io/projected/c9d716d0-989b-40b4-9996-e83ea38ff758-kube-api-access-728d6\") pod \"maas-default-gateway-openshift-default-58b6f876-nrghg\" (UID: \"c9d716d0-989b-40b4-9996-e83ea38ff758\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:22.914545 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:22.914445 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:23.047594 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:23.047563 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"]
Apr 20 16:32:23.050163 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:32:23.050131 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9d716d0_989b_40b4_9996_e83ea38ff758.slice/crio-86b2d7d523840ac590086ba30064a1703fa98e35d415bee49bb9b951c2655f46 WatchSource:0}: Error finding container 86b2d7d523840ac590086ba30064a1703fa98e35d415bee49bb9b951c2655f46: Status 404 returned error can't find the container with id 86b2d7d523840ac590086ba30064a1703fa98e35d415bee49bb9b951c2655f46
Apr 20 16:32:23.987615 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:23.987575 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg" event={"ID":"c9d716d0-989b-40b4-9996-e83ea38ff758","Type":"ContainerStarted","Data":"86b2d7d523840ac590086ba30064a1703fa98e35d415bee49bb9b951c2655f46"}
Apr 20 16:32:26.596998 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:26.596960 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 20 16:32:26.597285 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:26.597050 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 20 16:32:26.597285 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:26.597080 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 20 16:32:26.999102 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:26.999020 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg" event={"ID":"c9d716d0-989b-40b4-9996-e83ea38ff758","Type":"ContainerStarted","Data":"e168e6d0e1b1d81f8f28040dffc65fd504332ef0c050529631894c8c082f1b3f"}
Apr 20 16:32:27.018030 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:27.017979 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg" podStartSLOduration=1.473431302 podStartE2EDuration="5.017965488s" podCreationTimestamp="2026-04-20 16:32:22 +0000 UTC" firstStartedPulling="2026-04-20 16:32:23.05218911 +0000 UTC m=+551.278397029" lastFinishedPulling="2026-04-20 16:32:26.596723295 +0000 UTC m=+554.822931215" observedRunningTime="2026-04-20 16:32:27.017381539 +0000 UTC m=+555.243589481" watchObservedRunningTime="2026-04-20 16:32:27.017965488 +0000 UTC m=+555.244173487"
Apr 20 16:32:27.915196 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:27.915150 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:27.920131 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:27.920106 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:28.002411 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:28.002382 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:28.003495 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:28.003474 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nrghg"
Apr 20 16:32:36.684117 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:36.684079 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-skz29"]
Apr 20 16:32:36.687773 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:36.687739 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-skz29"
Apr 20 16:32:36.690380 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:36.690358 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 20 16:32:36.690513 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:36.690358 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-fc88d\""
Apr 20 16:32:36.696727 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:36.696694 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-skz29"]
Apr 20 16:32:36.744383 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:36.744347 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c348c51d-9f28-4087-a5b8-7f11eb4faf5f-config-file\") pod \"limitador-limitador-7d549b5b-skz29\" (UID: \"c348c51d-9f28-4087-a5b8-7f11eb4faf5f\") " pod="kuadrant-system/limitador-limitador-7d549b5b-skz29"
Apr 20 16:32:36.744618 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:36.744399 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tq9n\" (UniqueName: \"kubernetes.io/projected/c348c51d-9f28-4087-a5b8-7f11eb4faf5f-kube-api-access-4tq9n\") pod \"limitador-limitador-7d549b5b-skz29\" (UID: \"c348c51d-9f28-4087-a5b8-7f11eb4faf5f\") " pod="kuadrant-system/limitador-limitador-7d549b5b-skz29"
Apr 20 16:32:36.776215 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:36.776178 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-skz29"]
Apr 20 16:32:36.845842 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:36.845803 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c348c51d-9f28-4087-a5b8-7f11eb4faf5f-config-file\") pod \"limitador-limitador-7d549b5b-skz29\" (UID: \"c348c51d-9f28-4087-a5b8-7f11eb4faf5f\") " pod="kuadrant-system/limitador-limitador-7d549b5b-skz29"
Apr 20 16:32:36.846051 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:36.845882 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4tq9n\" (UniqueName: \"kubernetes.io/projected/c348c51d-9f28-4087-a5b8-7f11eb4faf5f-kube-api-access-4tq9n\") pod \"limitador-limitador-7d549b5b-skz29\" (UID: \"c348c51d-9f28-4087-a5b8-7f11eb4faf5f\") " pod="kuadrant-system/limitador-limitador-7d549b5b-skz29"
Apr 20 16:32:36.846489 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:36.846466 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c348c51d-9f28-4087-a5b8-7f11eb4faf5f-config-file\") pod \"limitador-limitador-7d549b5b-skz29\" (UID: \"c348c51d-9f28-4087-a5b8-7f11eb4faf5f\") " pod="kuadrant-system/limitador-limitador-7d549b5b-skz29"
Apr 20 16:32:36.855147 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:36.855121 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tq9n\" (UniqueName: \"kubernetes.io/projected/c348c51d-9f28-4087-a5b8-7f11eb4faf5f-kube-api-access-4tq9n\") pod \"limitador-limitador-7d549b5b-skz29\" (UID: \"c348c51d-9f28-4087-a5b8-7f11eb4faf5f\") " pod="kuadrant-system/limitador-limitador-7d549b5b-skz29"
Apr 20 16:32:36.998445 ip-10-0-142-44
kubenswrapper[2576]: I0420 16:32:36.998351 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-skz29" Apr 20 16:32:37.125011 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:37.124976 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-skz29"] Apr 20 16:32:37.127419 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:32:37.127388 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc348c51d_9f28_4087_a5b8_7f11eb4faf5f.slice/crio-cef6943c9bb744eb918eefa30000454e1161e6e9c7b4ce37ad98980ed75d9a45 WatchSource:0}: Error finding container cef6943c9bb744eb918eefa30000454e1161e6e9c7b4ce37ad98980ed75d9a45: Status 404 returned error can't find the container with id cef6943c9bb744eb918eefa30000454e1161e6e9c7b4ce37ad98980ed75d9a45 Apr 20 16:32:38.044678 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:38.044416 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-skz29" event={"ID":"c348c51d-9f28-4087-a5b8-7f11eb4faf5f","Type":"ContainerStarted","Data":"cef6943c9bb744eb918eefa30000454e1161e6e9c7b4ce37ad98980ed75d9a45"} Apr 20 16:32:40.053741 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:40.053703 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-skz29" event={"ID":"c348c51d-9f28-4087-a5b8-7f11eb4faf5f","Type":"ContainerStarted","Data":"2ee5208dee671776db145a15b31de687c47f87422965c0df3dcf01dcbf24c1cd"} Apr 20 16:32:40.054132 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:40.053771 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-skz29" Apr 20 16:32:40.070398 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:40.070289 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/limitador-limitador-7d549b5b-skz29" podStartSLOduration=1.3893806149999999 podStartE2EDuration="4.070274848s" podCreationTimestamp="2026-04-20 16:32:36 +0000 UTC" firstStartedPulling="2026-04-20 16:32:37.129184418 +0000 UTC m=+565.355392341" lastFinishedPulling="2026-04-20 16:32:39.810078649 +0000 UTC m=+568.036286574" observedRunningTime="2026-04-20 16:32:40.068623036 +0000 UTC m=+568.294830982" watchObservedRunningTime="2026-04-20 16:32:40.070274848 +0000 UTC m=+568.296482789" Apr 20 16:32:51.058873 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:51.058836 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-skz29" Apr 20 16:32:51.620023 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:51.619971 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-skz29"] Apr 20 16:32:51.621270 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:51.621217 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-skz29" podUID="c348c51d-9f28-4087-a5b8-7f11eb4faf5f" containerName="limitador" containerID="cri-o://2ee5208dee671776db145a15b31de687c47f87422965c0df3dcf01dcbf24c1cd" gracePeriod=30 Apr 20 16:32:52.093441 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:52.093411 2576 generic.go:358] "Generic (PLEG): container finished" podID="c348c51d-9f28-4087-a5b8-7f11eb4faf5f" containerID="2ee5208dee671776db145a15b31de687c47f87422965c0df3dcf01dcbf24c1cd" exitCode=0 Apr 20 16:32:52.093908 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:52.093479 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-skz29" event={"ID":"c348c51d-9f28-4087-a5b8-7f11eb4faf5f","Type":"ContainerDied","Data":"2ee5208dee671776db145a15b31de687c47f87422965c0df3dcf01dcbf24c1cd"} Apr 20 16:32:52.159886 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:52.159859 2576 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-skz29" Apr 20 16:32:52.274397 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:52.274365 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tq9n\" (UniqueName: \"kubernetes.io/projected/c348c51d-9f28-4087-a5b8-7f11eb4faf5f-kube-api-access-4tq9n\") pod \"c348c51d-9f28-4087-a5b8-7f11eb4faf5f\" (UID: \"c348c51d-9f28-4087-a5b8-7f11eb4faf5f\") " Apr 20 16:32:52.274551 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:52.274419 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c348c51d-9f28-4087-a5b8-7f11eb4faf5f-config-file\") pod \"c348c51d-9f28-4087-a5b8-7f11eb4faf5f\" (UID: \"c348c51d-9f28-4087-a5b8-7f11eb4faf5f\") " Apr 20 16:32:52.274818 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:52.274789 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c348c51d-9f28-4087-a5b8-7f11eb4faf5f-config-file" (OuterVolumeSpecName: "config-file") pod "c348c51d-9f28-4087-a5b8-7f11eb4faf5f" (UID: "c348c51d-9f28-4087-a5b8-7f11eb4faf5f"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 16:32:52.276461 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:52.276432 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c348c51d-9f28-4087-a5b8-7f11eb4faf5f-kube-api-access-4tq9n" (OuterVolumeSpecName: "kube-api-access-4tq9n") pod "c348c51d-9f28-4087-a5b8-7f11eb4faf5f" (UID: "c348c51d-9f28-4087-a5b8-7f11eb4faf5f"). InnerVolumeSpecName "kube-api-access-4tq9n". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 16:32:52.375866 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:52.375838 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4tq9n\" (UniqueName: \"kubernetes.io/projected/c348c51d-9f28-4087-a5b8-7f11eb4faf5f-kube-api-access-4tq9n\") on node \"ip-10-0-142-44.ec2.internal\" DevicePath \"\"" Apr 20 16:32:52.375866 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:52.375865 2576 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c348c51d-9f28-4087-a5b8-7f11eb4faf5f-config-file\") on node \"ip-10-0-142-44.ec2.internal\" DevicePath \"\"" Apr 20 16:32:53.097853 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:53.097813 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-skz29" event={"ID":"c348c51d-9f28-4087-a5b8-7f11eb4faf5f","Type":"ContainerDied","Data":"cef6943c9bb744eb918eefa30000454e1161e6e9c7b4ce37ad98980ed75d9a45"} Apr 20 16:32:53.098321 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:53.097863 2576 scope.go:117] "RemoveContainer" containerID="2ee5208dee671776db145a15b31de687c47f87422965c0df3dcf01dcbf24c1cd" Apr 20 16:32:53.098321 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:53.097898 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-skz29" Apr 20 16:32:53.114842 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:53.114816 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-skz29"] Apr 20 16:32:53.118251 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:53.118230 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-skz29"] Apr 20 16:32:53.880506 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:53.880471 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-wvgxx"] Apr 20 16:32:53.880783 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:53.880751 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c348c51d-9f28-4087-a5b8-7f11eb4faf5f" containerName="limitador" Apr 20 16:32:53.880783 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:53.880779 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c348c51d-9f28-4087-a5b8-7f11eb4faf5f" containerName="limitador" Apr 20 16:32:53.880860 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:53.880839 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c348c51d-9f28-4087-a5b8-7f11eb4faf5f" containerName="limitador" Apr 20 16:32:53.885304 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:53.885276 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-wvgxx" Apr 20 16:32:53.888035 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:53.888008 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 20 16:32:53.888035 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:53.888008 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-skg2d\"" Apr 20 16:32:53.893643 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:53.893225 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-wvgxx"] Apr 20 16:32:53.989373 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:53.989339 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z48st\" (UniqueName: \"kubernetes.io/projected/4fb832e3-c3b0-4734-9b91-faa9d764fcf1-kube-api-access-z48st\") pod \"postgres-868db5846d-wvgxx\" (UID: \"4fb832e3-c3b0-4734-9b91-faa9d764fcf1\") " pod="opendatahub/postgres-868db5846d-wvgxx" Apr 20 16:32:53.989522 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:53.989407 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/4fb832e3-c3b0-4734-9b91-faa9d764fcf1-data\") pod \"postgres-868db5846d-wvgxx\" (UID: \"4fb832e3-c3b0-4734-9b91-faa9d764fcf1\") " pod="opendatahub/postgres-868db5846d-wvgxx" Apr 20 16:32:54.090643 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:54.090602 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z48st\" (UniqueName: \"kubernetes.io/projected/4fb832e3-c3b0-4734-9b91-faa9d764fcf1-kube-api-access-z48st\") pod \"postgres-868db5846d-wvgxx\" (UID: \"4fb832e3-c3b0-4734-9b91-faa9d764fcf1\") " pod="opendatahub/postgres-868db5846d-wvgxx" Apr 20 16:32:54.090849 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:54.090667 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/4fb832e3-c3b0-4734-9b91-faa9d764fcf1-data\") pod \"postgres-868db5846d-wvgxx\" (UID: \"4fb832e3-c3b0-4734-9b91-faa9d764fcf1\") " pod="opendatahub/postgres-868db5846d-wvgxx" Apr 20 16:32:54.091044 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:54.091026 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/4fb832e3-c3b0-4734-9b91-faa9d764fcf1-data\") pod \"postgres-868db5846d-wvgxx\" (UID: \"4fb832e3-c3b0-4734-9b91-faa9d764fcf1\") " pod="opendatahub/postgres-868db5846d-wvgxx" Apr 20 16:32:54.099256 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:54.099229 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z48st\" (UniqueName: \"kubernetes.io/projected/4fb832e3-c3b0-4734-9b91-faa9d764fcf1-kube-api-access-z48st\") pod \"postgres-868db5846d-wvgxx\" (UID: \"4fb832e3-c3b0-4734-9b91-faa9d764fcf1\") " pod="opendatahub/postgres-868db5846d-wvgxx" Apr 20 16:32:54.198736 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:54.198638 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-wvgxx" Apr 20 16:32:54.323171 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:54.323133 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-wvgxx"] Apr 20 16:32:54.327465 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:32:54.327425 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fb832e3_c3b0_4734_9b91_faa9d764fcf1.slice/crio-2a75966850233cf3f1aa9e4b5f5854b6793f10e0c58d632db4cc920fa7d4b565 WatchSource:0}: Error finding container 2a75966850233cf3f1aa9e4b5f5854b6793f10e0c58d632db4cc920fa7d4b565: Status 404 returned error can't find the container with id 2a75966850233cf3f1aa9e4b5f5854b6793f10e0c58d632db4cc920fa7d4b565 Apr 20 16:32:54.363586 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:54.363541 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c348c51d-9f28-4087-a5b8-7f11eb4faf5f" path="/var/lib/kubelet/pods/c348c51d-9f28-4087-a5b8-7f11eb4faf5f/volumes" Apr 20 16:32:55.106441 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:32:55.106397 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-wvgxx" event={"ID":"4fb832e3-c3b0-4734-9b91-faa9d764fcf1","Type":"ContainerStarted","Data":"2a75966850233cf3f1aa9e4b5f5854b6793f10e0c58d632db4cc920fa7d4b565"} Apr 20 16:33:03.136865 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:33:03.136829 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-wvgxx" event={"ID":"4fb832e3-c3b0-4734-9b91-faa9d764fcf1","Type":"ContainerStarted","Data":"8dbdcb73183ee08334d1edcebdebeddd2f3eb8dc4d907690a0a658264d1b950a"} Apr 20 16:33:03.137307 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:33:03.136934 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-wvgxx" Apr 20 16:33:03.152707 ip-10-0-142-44 
kubenswrapper[2576]: I0420 16:33:03.152656 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-wvgxx" podStartSLOduration=1.877814635 podStartE2EDuration="10.152642141s" podCreationTimestamp="2026-04-20 16:32:53 +0000 UTC" firstStartedPulling="2026-04-20 16:32:54.328887823 +0000 UTC m=+582.555095749" lastFinishedPulling="2026-04-20 16:33:02.603715335 +0000 UTC m=+590.829923255" observedRunningTime="2026-04-20 16:33:03.151163553 +0000 UTC m=+591.377371496" watchObservedRunningTime="2026-04-20 16:33:03.152642141 +0000 UTC m=+591.378850082" Apr 20 16:33:09.168631 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:33:09.168598 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-wvgxx" Apr 20 16:33:12.255453 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:33:12.255426 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s95ld_3c78e1c2-fb6e-458b-8593-64d3e48a714e/ovn-acl-logging/0.log" Apr 20 16:33:12.255884 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:33:12.255794 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s95ld_3c78e1c2-fb6e-458b-8593-64d3e48a714e/ovn-acl-logging/0.log" Apr 20 16:33:18.212363 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:33:18.212324 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-mpdgw"] Apr 20 16:33:18.217145 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:33:18.217123 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/keycloak-operator-5c4df598dd-mpdgw" Apr 20 16:33:18.221287 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:33:18.221255 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"keycloak-operator-dockercfg-8wxbv\"" Apr 20 16:33:18.221417 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:33:18.221296 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\"" Apr 20 16:33:18.222671 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:33:18.222646 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\"" Apr 20 16:33:18.227426 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:33:18.227398 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-mpdgw"] Apr 20 16:33:18.272114 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:33:18.272075 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r898x\" (UniqueName: \"kubernetes.io/projected/4735432b-d53f-4224-9661-b79cb1fe1401-kube-api-access-r898x\") pod \"keycloak-operator-5c4df598dd-mpdgw\" (UID: \"4735432b-d53f-4224-9661-b79cb1fe1401\") " pod="keycloak-system/keycloak-operator-5c4df598dd-mpdgw" Apr 20 16:33:18.373279 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:33:18.373245 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r898x\" (UniqueName: \"kubernetes.io/projected/4735432b-d53f-4224-9661-b79cb1fe1401-kube-api-access-r898x\") pod \"keycloak-operator-5c4df598dd-mpdgw\" (UID: \"4735432b-d53f-4224-9661-b79cb1fe1401\") " pod="keycloak-system/keycloak-operator-5c4df598dd-mpdgw" Apr 20 16:33:18.386085 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:33:18.386048 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r898x\" (UniqueName: 
\"kubernetes.io/projected/4735432b-d53f-4224-9661-b79cb1fe1401-kube-api-access-r898x\") pod \"keycloak-operator-5c4df598dd-mpdgw\" (UID: \"4735432b-d53f-4224-9661-b79cb1fe1401\") " pod="keycloak-system/keycloak-operator-5c4df598dd-mpdgw" Apr 20 16:33:18.527658 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:33:18.527626 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/keycloak-operator-5c4df598dd-mpdgw" Apr 20 16:33:18.648559 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:33:18.648532 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-mpdgw"] Apr 20 16:33:18.651146 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:33:18.651114 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4735432b_d53f_4224_9661_b79cb1fe1401.slice/crio-a867aaa8d2f404d03b3deb7e807a7533073c49b264f64176a9be458c688db62a WatchSource:0}: Error finding container a867aaa8d2f404d03b3deb7e807a7533073c49b264f64176a9be458c688db62a: Status 404 returned error can't find the container with id a867aaa8d2f404d03b3deb7e807a7533073c49b264f64176a9be458c688db62a Apr 20 16:33:19.189316 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:33:19.189281 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/keycloak-operator-5c4df598dd-mpdgw" event={"ID":"4735432b-d53f-4224-9661-b79cb1fe1401","Type":"ContainerStarted","Data":"a867aaa8d2f404d03b3deb7e807a7533073c49b264f64176a9be458c688db62a"} Apr 20 16:33:25.216605 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:33:25.216561 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/keycloak-operator-5c4df598dd-mpdgw" event={"ID":"4735432b-d53f-4224-9661-b79cb1fe1401","Type":"ContainerStarted","Data":"b6ba4c461948a7e8642a087ede38478ecfa20bdeea94e685015da6a265368c8b"} Apr 20 16:33:25.233086 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:33:25.233027 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/keycloak-operator-5c4df598dd-mpdgw" podStartSLOduration=1.146991986 podStartE2EDuration="7.233008952s" podCreationTimestamp="2026-04-20 16:33:18 +0000 UTC" firstStartedPulling="2026-04-20 16:33:18.652539535 +0000 UTC m=+606.878747455" lastFinishedPulling="2026-04-20 16:33:24.738556501 +0000 UTC m=+612.964764421" observedRunningTime="2026-04-20 16:33:25.232112144 +0000 UTC m=+613.458320126" watchObservedRunningTime="2026-04-20 16:33:25.233008952 +0000 UTC m=+613.459216895" Apr 20 16:37:38.757250 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:38.757218 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-r66jj_6a4e8404-a71d-445f-9f36-93fca6e28194/manager/0.log" Apr 20 16:37:39.135019 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:39.134923 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-zwzxb_00b40740-eb5d-47b4-89d8-20cf339ccaad/manager/2.log" Apr 20 16:37:39.364612 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:39.364581 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-59c64b9875-wr4hm_372ab175-146b-459d-9716-a7cf657d987d/manager/0.log" Apr 20 16:37:39.603990 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:39.603954 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-wvgxx_4fb832e3-c3b0-4734-9b91-faa9d764fcf1/postgres/0.log" Apr 20 16:37:41.023285 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:41.023254 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-dmzwm_03d385b5-2557-4579-b802-db01cd525052/manager/0.log" Apr 20 16:37:42.229948 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:42.229894 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-sd6vl_d55126c5-0a68-45f4-a6de-2a589a931055/discovery/0.log" Apr 20 16:37:42.351492 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:42.351459 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7b9c9c888c-98w6z_121fb884-b4fb-4849-96fd-e84b3d5a0884/kube-auth-proxy/0.log" Apr 20 16:37:42.580543 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:42.580513 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-nrghg_c9d716d0-989b-40b4-9996-e83ea38ff758/istio-proxy/0.log" Apr 20 16:37:50.354090 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:50.354058 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-28hkj_09560999-6ebd-4da5-b805-d700919dfb04/global-pull-secret-syncer/0.log" Apr 20 16:37:50.514390 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:50.514358 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-7q74k_b0527932-43f9-4c44-8a48-e6b0fc353de6/konnectivity-agent/0.log" Apr 20 16:37:50.610629 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:50.610538 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-44.ec2.internal_1fe42efe7b737ea774d310634568d2b9/haproxy/0.log" Apr 20 16:37:55.016587 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:55.016552 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-dmzwm_03d385b5-2557-4579-b802-db01cd525052/manager/0.log" Apr 20 16:37:56.984270 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:56.984226 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-99k55_f2193334-65a2-4f35-bbed-0117cbe5d424/node-exporter/0.log" Apr 20 16:37:57.004818 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:57.004788 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-99k55_f2193334-65a2-4f35-bbed-0117cbe5d424/kube-rbac-proxy/0.log" Apr 20 16:37:57.026044 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:57.026019 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-99k55_f2193334-65a2-4f35-bbed-0117cbe5d424/init-textfile/0.log" Apr 20 16:37:59.106845 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:59.106809 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wg92x/perf-node-gather-daemonset-mql4d"] Apr 20 16:37:59.110212 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:59.110187 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-mql4d" Apr 20 16:37:59.112693 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:59.112672 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wg92x\"/\"default-dockercfg-9rm4f\"" Apr 20 16:37:59.112826 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:59.112695 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wg92x\"/\"kube-root-ca.crt\"" Apr 20 16:37:59.113770 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:59.113745 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wg92x\"/\"openshift-service-ca.crt\"" Apr 20 16:37:59.116769 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:59.116729 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wg92x/perf-node-gather-daemonset-mql4d"] Apr 20 16:37:59.141351 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:59.141297 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5393a22d-b32f-42c4-ab10-1d86d12503bd-proc\") pod 
\"perf-node-gather-daemonset-mql4d\" (UID: \"5393a22d-b32f-42c4-ab10-1d86d12503bd\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-mql4d" Apr 20 16:37:59.141483 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:59.141415 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5393a22d-b32f-42c4-ab10-1d86d12503bd-sys\") pod \"perf-node-gather-daemonset-mql4d\" (UID: \"5393a22d-b32f-42c4-ab10-1d86d12503bd\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-mql4d" Apr 20 16:37:59.141483 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:59.141441 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5393a22d-b32f-42c4-ab10-1d86d12503bd-podres\") pod \"perf-node-gather-daemonset-mql4d\" (UID: \"5393a22d-b32f-42c4-ab10-1d86d12503bd\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-mql4d" Apr 20 16:37:59.141580 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:59.141493 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9rmt\" (UniqueName: \"kubernetes.io/projected/5393a22d-b32f-42c4-ab10-1d86d12503bd-kube-api-access-t9rmt\") pod \"perf-node-gather-daemonset-mql4d\" (UID: \"5393a22d-b32f-42c4-ab10-1d86d12503bd\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-mql4d" Apr 20 16:37:59.141580 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:59.141530 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5393a22d-b32f-42c4-ab10-1d86d12503bd-lib-modules\") pod \"perf-node-gather-daemonset-mql4d\" (UID: \"5393a22d-b32f-42c4-ab10-1d86d12503bd\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-mql4d" Apr 20 16:37:59.242564 ip-10-0-142-44 kubenswrapper[2576]: I0420 
16:37:59.242523 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5393a22d-b32f-42c4-ab10-1d86d12503bd-proc\") pod \"perf-node-gather-daemonset-mql4d\" (UID: \"5393a22d-b32f-42c4-ab10-1d86d12503bd\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-mql4d" Apr 20 16:37:59.242802 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:59.242581 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5393a22d-b32f-42c4-ab10-1d86d12503bd-sys\") pod \"perf-node-gather-daemonset-mql4d\" (UID: \"5393a22d-b32f-42c4-ab10-1d86d12503bd\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-mql4d" Apr 20 16:37:59.242802 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:59.242602 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5393a22d-b32f-42c4-ab10-1d86d12503bd-podres\") pod \"perf-node-gather-daemonset-mql4d\" (UID: \"5393a22d-b32f-42c4-ab10-1d86d12503bd\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-mql4d" Apr 20 16:37:59.242802 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:59.242626 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9rmt\" (UniqueName: \"kubernetes.io/projected/5393a22d-b32f-42c4-ab10-1d86d12503bd-kube-api-access-t9rmt\") pod \"perf-node-gather-daemonset-mql4d\" (UID: \"5393a22d-b32f-42c4-ab10-1d86d12503bd\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-mql4d" Apr 20 16:37:59.242802 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:59.242649 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5393a22d-b32f-42c4-ab10-1d86d12503bd-lib-modules\") pod \"perf-node-gather-daemonset-mql4d\" (UID: \"5393a22d-b32f-42c4-ab10-1d86d12503bd\") " 
pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-mql4d" Apr 20 16:37:59.242802 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:59.242657 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5393a22d-b32f-42c4-ab10-1d86d12503bd-proc\") pod \"perf-node-gather-daemonset-mql4d\" (UID: \"5393a22d-b32f-42c4-ab10-1d86d12503bd\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-mql4d" Apr 20 16:37:59.242802 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:59.242749 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5393a22d-b32f-42c4-ab10-1d86d12503bd-sys\") pod \"perf-node-gather-daemonset-mql4d\" (UID: \"5393a22d-b32f-42c4-ab10-1d86d12503bd\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-mql4d" Apr 20 16:37:59.242802 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:59.242793 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5393a22d-b32f-42c4-ab10-1d86d12503bd-podres\") pod \"perf-node-gather-daemonset-mql4d\" (UID: \"5393a22d-b32f-42c4-ab10-1d86d12503bd\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-mql4d" Apr 20 16:37:59.243073 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:59.242804 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5393a22d-b32f-42c4-ab10-1d86d12503bd-lib-modules\") pod \"perf-node-gather-daemonset-mql4d\" (UID: \"5393a22d-b32f-42c4-ab10-1d86d12503bd\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-mql4d" Apr 20 16:37:59.250511 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:59.250486 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9rmt\" (UniqueName: \"kubernetes.io/projected/5393a22d-b32f-42c4-ab10-1d86d12503bd-kube-api-access-t9rmt\") pod 
\"perf-node-gather-daemonset-mql4d\" (UID: \"5393a22d-b32f-42c4-ab10-1d86d12503bd\") " pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-mql4d" Apr 20 16:37:59.436129 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:59.436030 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-mql4d" Apr 20 16:37:59.591685 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:59.591501 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wg92x/perf-node-gather-daemonset-mql4d"] Apr 20 16:37:59.594405 ip-10-0-142-44 kubenswrapper[2576]: W0420 16:37:59.594375 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5393a22d_b32f_42c4_ab10_1d86d12503bd.slice/crio-e8db873f02891ac22564bbc9cbab4a1a8ed9149380f78778e5c307869f93f311 WatchSource:0}: Error finding container e8db873f02891ac22564bbc9cbab4a1a8ed9149380f78778e5c307869f93f311: Status 404 returned error can't find the container with id e8db873f02891ac22564bbc9cbab4a1a8ed9149380f78778e5c307869f93f311 Apr 20 16:37:59.595868 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:37:59.595853 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 16:38:00.124363 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:00.124328 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-mql4d" event={"ID":"5393a22d-b32f-42c4-ab10-1d86d12503bd","Type":"ContainerStarted","Data":"1f75f33a88e2119f203c2268b5ea99a1eacf7f479785147c728e8d020865f362"} Apr 20 16:38:00.124363 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:00.124364 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-mql4d" event={"ID":"5393a22d-b32f-42c4-ab10-1d86d12503bd","Type":"ContainerStarted","Data":"e8db873f02891ac22564bbc9cbab4a1a8ed9149380f78778e5c307869f93f311"} Apr 
20 16:38:00.124840 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:00.124468 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-mql4d" Apr 20 16:38:00.144691 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:00.144639 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-mql4d" podStartSLOduration=1.144624904 podStartE2EDuration="1.144624904s" podCreationTimestamp="2026-04-20 16:37:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 16:38:00.141357734 +0000 UTC m=+888.367565675" watchObservedRunningTime="2026-04-20 16:38:00.144624904 +0000 UTC m=+888.370832846" Apr 20 16:38:01.276931 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:01.276902 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-t2rgz_c0c73998-a3cf-46ac-88ee-04698be10974/dns/0.log" Apr 20 16:38:01.299151 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:01.299124 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-t2rgz_c0c73998-a3cf-46ac-88ee-04698be10974/kube-rbac-proxy/0.log" Apr 20 16:38:01.372005 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:01.371963 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-tnl7m_0f424042-eb12-467e-85c1-cbdd302c3e4d/dns-node-resolver/0.log" Apr 20 16:38:01.894958 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:01.894928 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wn2xt_bc10ffb1-dd19-4a22-a3ed-7437a80f1ba7/node-ca/0.log" Apr 20 16:38:02.796663 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:02.796630 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-sd6vl_d55126c5-0a68-45f4-a6de-2a589a931055/discovery/0.log" Apr 20 16:38:02.816993 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:02.816960 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7b9c9c888c-98w6z_121fb884-b4fb-4849-96fd-e84b3d5a0884/kube-auth-proxy/0.log" Apr 20 16:38:02.868565 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:02.868531 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-nrghg_c9d716d0-989b-40b4-9996-e83ea38ff758/istio-proxy/0.log" Apr 20 16:38:03.367724 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:03.367688 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-7f8db_0dfe4973-64f0-41f7-a34e-6d35be53c155/serve-healthcheck-canary/0.log" Apr 20 16:38:04.080325 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:04.080296 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7jzmt_7ec95865-41d4-4612-acad-c1c0a5433c03/kube-rbac-proxy/0.log" Apr 20 16:38:04.111403 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:04.111378 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7jzmt_7ec95865-41d4-4612-acad-c1c0a5433c03/exporter/0.log" Apr 20 16:38:04.147450 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:04.147424 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7jzmt_7ec95865-41d4-4612-acad-c1c0a5433c03/extractor/0.log" Apr 20 16:38:06.139096 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:06.139056 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-r66jj_6a4e8404-a71d-445f-9f36-93fca6e28194/manager/0.log" Apr 20 16:38:06.139500 ip-10-0-142-44 kubenswrapper[2576]: I0420 
16:38:06.139369 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-wg92x/perf-node-gather-daemonset-mql4d" Apr 20 16:38:06.215920 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:06.215881 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-zwzxb_00b40740-eb5d-47b4-89d8-20cf339ccaad/manager/1.log" Apr 20 16:38:06.235104 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:06.235075 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-zwzxb_00b40740-eb5d-47b4-89d8-20cf339ccaad/manager/2.log" Apr 20 16:38:06.286480 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:06.286450 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-59c64b9875-wr4hm_372ab175-146b-459d-9716-a7cf657d987d/manager/0.log" Apr 20 16:38:06.361078 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:06.361010 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-wvgxx_4fb832e3-c3b0-4734-9b91-faa9d764fcf1/postgres/0.log" Apr 20 16:38:07.731136 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:07.731103 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-k4zbp_91aea935-20f3-4802-8e93-6603088be733/openshift-lws-operator/0.log" Apr 20 16:38:12.278897 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:12.278868 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s95ld_3c78e1c2-fb6e-458b-8593-64d3e48a714e/ovn-acl-logging/0.log" Apr 20 16:38:12.278897 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:12.278882 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s95ld_3c78e1c2-fb6e-458b-8593-64d3e48a714e/ovn-acl-logging/0.log" Apr 20 16:38:12.291256 ip-10-0-142-44 kubenswrapper[2576]: I0420 
16:38:12.291228 2576 scope.go:117] "RemoveContainer" containerID="f79fd6be195dea80f592540b247eca4966c8c1babb72edda43c038d3801e855e" Apr 20 16:38:13.987446 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:13.987417 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wsgnn_1bb216ed-aa87-4017-b000-0f3d37d1fda9/kube-multus-additional-cni-plugins/0.log" Apr 20 16:38:14.016733 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:14.016701 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wsgnn_1bb216ed-aa87-4017-b000-0f3d37d1fda9/egress-router-binary-copy/0.log" Apr 20 16:38:14.040969 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:14.040941 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wsgnn_1bb216ed-aa87-4017-b000-0f3d37d1fda9/cni-plugins/0.log" Apr 20 16:38:14.067868 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:14.067843 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wsgnn_1bb216ed-aa87-4017-b000-0f3d37d1fda9/bond-cni-plugin/0.log" Apr 20 16:38:14.090379 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:14.090350 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wsgnn_1bb216ed-aa87-4017-b000-0f3d37d1fda9/routeoverride-cni/0.log" Apr 20 16:38:14.114106 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:14.114075 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wsgnn_1bb216ed-aa87-4017-b000-0f3d37d1fda9/whereabouts-cni-bincopy/0.log" Apr 20 16:38:14.137657 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:14.137630 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wsgnn_1bb216ed-aa87-4017-b000-0f3d37d1fda9/whereabouts-cni/0.log" Apr 20 16:38:14.177356 
ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:14.177325 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-v7jxk_23487f52-5abf-4f26-b6e5-427ce8611cdb/kube-multus/0.log" Apr 20 16:38:14.294308 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:14.294280 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-tr5xd_7948c105-68aa-437a-a0ac-fa0d535c7b37/network-metrics-daemon/0.log" Apr 20 16:38:14.315536 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:14.315503 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-tr5xd_7948c105-68aa-437a-a0ac-fa0d535c7b37/kube-rbac-proxy/0.log" Apr 20 16:38:15.813312 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:15.813275 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s95ld_3c78e1c2-fb6e-458b-8593-64d3e48a714e/ovn-controller/0.log" Apr 20 16:38:15.830631 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:15.830595 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s95ld_3c78e1c2-fb6e-458b-8593-64d3e48a714e/ovn-acl-logging/0.log" Apr 20 16:38:15.838654 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:15.838616 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s95ld_3c78e1c2-fb6e-458b-8593-64d3e48a714e/ovn-acl-logging/1.log" Apr 20 16:38:15.862566 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:15.862536 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s95ld_3c78e1c2-fb6e-458b-8593-64d3e48a714e/kube-rbac-proxy-node/0.log" Apr 20 16:38:15.885858 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:15.885822 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s95ld_3c78e1c2-fb6e-458b-8593-64d3e48a714e/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 
16:38:15.905013 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:15.904982 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s95ld_3c78e1c2-fb6e-458b-8593-64d3e48a714e/northd/0.log" Apr 20 16:38:15.926977 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:15.926947 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s95ld_3c78e1c2-fb6e-458b-8593-64d3e48a714e/nbdb/0.log" Apr 20 16:38:15.949288 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:15.949257 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s95ld_3c78e1c2-fb6e-458b-8593-64d3e48a714e/sbdb/0.log" Apr 20 16:38:16.111129 ip-10-0-142-44 kubenswrapper[2576]: I0420 16:38:16.111052 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s95ld_3c78e1c2-fb6e-458b-8593-64d3e48a714e/ovnkube-controller/0.log"