Apr 16 22:11:03.690287 ip-10-0-130-16 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 22:11:03.690300 ip-10-0-130-16 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 22:11:03.690310 ip-10-0-130-16 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 22:11:03.690628 ip-10-0-130-16 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 22:11:14.750648 ip-10-0-130-16 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 22:11:14.750662 ip-10-0-130-16 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 030a1bd0ec984bb8af3351489d8cb30d --
Apr 16 22:13:43.548104 ip-10-0-130-16 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 22:13:44.104833 ip-10-0-130-16 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:44.104833 ip-10-0-130-16 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 22:13:44.104833 ip-10-0-130-16 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:44.104833 ip-10-0-130-16 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 22:13:44.104833 ip-10-0-130-16 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:44.108653 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.108554 2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 22:13:44.117778 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117733 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:44.117778 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117757 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:44.117778 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117775 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:44.117778 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117781 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:44.117778 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117785 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:44.117778 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117788 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:44.117778 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117791 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:44.118063 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117794 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:44.118063 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117797 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:44.118063 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117800 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:44.118063 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117803 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:44.118063 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117805 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:44.118063 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117808 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:44.118063 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117811 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:44.118063 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117814 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:44.118063 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117816 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:44.118063 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117819 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:44.118063 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117822 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:44.118063 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117824 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:44.118063 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117827 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:44.118063 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117834 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:44.118063 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117837 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:44.118063 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117840 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:44.118063 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117842 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:44.118063 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117845 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:44.118063 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117847 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:44.118063 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117850 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:44.118545 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117854 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:44.118545 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117858 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:44.118545 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117862 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:44.118545 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117865 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:44.118545 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117867 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:44.118545 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117870 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:44.118545 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117873 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:44.118545 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117876 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:44.118545 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117879 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:44.118545 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117882 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:44.118545 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117884 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:44.118545 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117887 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:44.118545 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117890 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:44.118545 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117895 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:44.118545 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117898 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:44.118545 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117900 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:44.118545 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117903 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:44.118545 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117905 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:44.118545 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117909 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:44.119061 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117912 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:44.119061 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117915 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:44.119061 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117918 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:44.119061 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117920 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:44.119061 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117923 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:44.119061 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117925 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:44.119061 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117928 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:44.119061 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117932 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:44.119061 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117935 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:44.119061 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117938 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:44.119061 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117940 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:44.119061 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117943 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:44.119061 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117945 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:44.119061 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117948 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:44.119061 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117951 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:44.119061 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117954 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:44.119061 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117956 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:44.119061 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117959 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:44.119061 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117961 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:44.119061 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117964 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:44.119585 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117967 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:44.119585 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117969 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:44.119585 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117972 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:44.119585 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117974 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:44.119585 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117977 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:44.119585 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117979 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:44.119585 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117982 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:44.119585 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117985 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:44.119585 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117989 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:44.119585 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117992 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:44.119585 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117995 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:44.119585 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.117998 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:44.119585 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118001 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:44.119585 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118003 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:44.119585 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118006 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:44.119585 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118008 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:44.119585 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118011 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:44.119585 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118014 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:44.119585 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118016 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:44.119585 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118018 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:44.120137 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118859 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:44.120137 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118866 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:44.120137 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118869 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:44.120137 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118873 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:44.120137 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118876 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:44.120137 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118879 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:44.120137 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118882 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:44.120137 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118885 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:44.120137 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118888 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:44.120137 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118891 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:44.120137 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118894 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:44.120137 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118898 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:44.120137 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118900 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:44.120137 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118904 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:44.120137 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118906 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:44.120137 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118909 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:44.120137 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118912 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:44.120137 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118915 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:44.120137 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118918 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:44.120137 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118920 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:44.120631 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118923 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:44.120631 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118926 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:44.120631 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118928 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:44.120631 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118931 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:44.120631 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118934 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:44.120631 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118937 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:44.120631 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118941 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:44.120631 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118945 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:44.120631 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118948 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:44.120631 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118951 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:44.120631 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118954 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:44.120631 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118957 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:44.120631 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118960 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:44.120631 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118962 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:44.120631 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118965 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:44.120631 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118967 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:44.120631 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118970 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:44.120631 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118974 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:44.120631 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118976 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:44.120631 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118979 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:44.121153 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118981 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:44.121153 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118984 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:44.121153 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118987 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:44.121153 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118989 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:44.121153 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118992 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:44.121153 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118994 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:44.121153 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118997 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:44.121153 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.118999 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:44.121153 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119002 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:44.121153 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119004 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:44.121153 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119007 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:44.121153 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119010 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:44.121153 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119013 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:44.121153 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119015 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:44.121153 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119018 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:44.121153 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119021 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:44.121153 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119023 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:44.121153 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119026 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:44.121153 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119029 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:44.121153 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119032 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:44.121648 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119034 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:44.121648 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119037 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:44.121648 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119040 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:44.121648 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119043 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:44.121648 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119045 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:44.121648 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119048 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:44.121648 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119050 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:44.121648 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119053 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:44.121648 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119055 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:44.121648 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119058 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:44.121648 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119061 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:44.121648 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119063 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:44.121648 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119066 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:44.121648 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119068 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:44.121648 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119071 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:44.121648 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119074 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:44.121648 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119076 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:44.121648 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119078 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:44.121648 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119081 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:44.121648 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119083 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:44.122161 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119086 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:44.122161 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119090 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:44.122161 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119093 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:44.122161 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119096 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:44.122161 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119098 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:44.122161 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.119101 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:44.122161 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120560 2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 22:13:44.122161 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120570 2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 22:13:44.122161 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120576 2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 22:13:44.122161 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120581 2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 22:13:44.122161 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120587 2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 22:13:44.122161 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120591 2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 22:13:44.122161 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120595 2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 22:13:44.122161 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120600 2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 22:13:44.122161 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120603 2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 22:13:44.122161 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120606 2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 22:13:44.122161 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120610 2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 22:13:44.122161 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120613 2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 22:13:44.122161 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120617 2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 22:13:44.122161 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120620 2571 flags.go:64] FLAG: --cgroup-root=""
Apr 16 22:13:44.122161 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120623 2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 22:13:44.122161 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120626 2571 flags.go:64] FLAG: --client-ca-file=""
Apr 16 22:13:44.122161 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120629 2571 flags.go:64] FLAG: --cloud-config=""
Apr 16 22:13:44.122722 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120632 2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 22:13:44.122722 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120636 2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 22:13:44.122722 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120641 2571 flags.go:64] FLAG: --cluster-domain=""
Apr 16 22:13:44.122722 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120644 2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 22:13:44.122722 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120648 2571 flags.go:64] FLAG: --config-dir=""
Apr 16 22:13:44.122722 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120651 2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 22:13:44.122722 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120654 2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 22:13:44.122722 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120658 2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 22:13:44.122722 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120661 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 22:13:44.122722 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120665 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 22:13:44.122722 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120668 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 22:13:44.122722 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120671 2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 22:13:44.122722 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120675 2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 22:13:44.122722 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120678 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 22:13:44.122722 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120681 2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 22:13:44.122722 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120684 2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 22:13:44.122722 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120688 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 22:13:44.122722 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120691 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 22:13:44.122722 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120694 2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 22:13:44.122722 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120697 2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 22:13:44.122722 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120701 2571 flags.go:64] FLAG: --enable-server="true"
Apr 16 22:13:44.122722 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120704 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 22:13:44.122722 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120708 2571 flags.go:64] FLAG: --event-burst="100"
Apr 16 22:13:44.122722 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120711 2571 flags.go:64] FLAG: --event-qps="50"
Apr 16 22:13:44.122722 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120715 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 22:13:44.123357 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120719 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 22:13:44.123357 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120722 2571 flags.go:64] FLAG: --eviction-hard=""
Apr 16 22:13:44.123357 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120726 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 22:13:44.123357 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120729 2571 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 22:13:44.123357 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120732 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 22:13:44.123357 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120735 2571 flags.go:64] FLAG: --eviction-soft=""
Apr 16 22:13:44.123357 ip-10-0-130-16 kubenswrapper[2571]: I0416
22:13:44.120739 2571 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 22:13:44.123357 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120741 2571 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 22:13:44.123357 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120745 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 22:13:44.123357 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120747 2571 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 22:13:44.123357 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120750 2571 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 22:13:44.123357 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120753 2571 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 22:13:44.123357 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120757 2571 flags.go:64] FLAG: --feature-gates="" Apr 16 22:13:44.123357 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120761 2571 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 22:13:44.123357 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120777 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 22:13:44.123357 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120781 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 22:13:44.123357 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120784 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 22:13:44.123357 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120787 2571 flags.go:64] FLAG: --healthz-port="10248" Apr 16 22:13:44.123357 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120790 2571 flags.go:64] FLAG: --help="false" Apr 16 22:13:44.123357 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120794 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-130-16.ec2.internal" Apr 16 22:13:44.123357 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120797 2571 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 
22:13:44.123357 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120800 2571 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 22:13:44.123357 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120803 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 22:13:44.123937 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120806 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 22:13:44.123937 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120810 2571 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 22:13:44.123937 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120813 2571 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 22:13:44.123937 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120816 2571 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 22:13:44.123937 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120819 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 22:13:44.123937 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120823 2571 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 22:13:44.123937 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120826 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 22:13:44.123937 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120829 2571 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 22:13:44.123937 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120832 2571 flags.go:64] FLAG: --kube-reserved="" Apr 16 22:13:44.123937 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120835 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 22:13:44.123937 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120846 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 22:13:44.123937 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120851 2571 flags.go:64] 
FLAG: --kubelet-cgroups="" Apr 16 22:13:44.123937 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120854 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 22:13:44.123937 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120857 2571 flags.go:64] FLAG: --lock-file="" Apr 16 22:13:44.123937 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120860 2571 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 22:13:44.123937 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120864 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 22:13:44.123937 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120867 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 22:13:44.123937 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120873 2571 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 22:13:44.123937 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120876 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 22:13:44.123937 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120879 2571 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 22:13:44.123937 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120881 2571 flags.go:64] FLAG: --logging-format="text" Apr 16 22:13:44.123937 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120884 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 22:13:44.123937 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120888 2571 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 22:13:44.123937 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120891 2571 flags.go:64] FLAG: --manifest-url="" Apr 16 22:13:44.124518 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120894 2571 flags.go:64] FLAG: --manifest-url-header="" Apr 16 22:13:44.124518 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120899 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 22:13:44.124518 ip-10-0-130-16 kubenswrapper[2571]: I0416 
22:13:44.120902 2571 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 22:13:44.124518 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120906 2571 flags.go:64] FLAG: --max-pods="110" Apr 16 22:13:44.124518 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120909 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 22:13:44.124518 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120912 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 22:13:44.124518 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120915 2571 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 22:13:44.124518 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120917 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 22:13:44.124518 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120921 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 22:13:44.124518 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120924 2571 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 22:13:44.124518 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120927 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 22:13:44.124518 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120940 2571 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 22:13:44.124518 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120943 2571 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 22:13:44.124518 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120946 2571 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 22:13:44.124518 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120951 2571 flags.go:64] FLAG: --pod-cidr="" Apr 16 22:13:44.124518 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120954 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 
22:13:44.124518 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120960 2571 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 22:13:44.124518 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120963 2571 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 22:13:44.124518 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120966 2571 flags.go:64] FLAG: --pods-per-core="0" Apr 16 22:13:44.124518 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120969 2571 flags.go:64] FLAG: --port="10250" Apr 16 22:13:44.124518 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120973 2571 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 22:13:44.124518 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120976 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-04d86bd52ffd9768a" Apr 16 22:13:44.124518 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120979 2571 flags.go:64] FLAG: --qos-reserved="" Apr 16 22:13:44.124518 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120987 2571 flags.go:64] FLAG: --read-only-port="10255" Apr 16 22:13:44.125165 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120990 2571 flags.go:64] FLAG: --register-node="true" Apr 16 22:13:44.125165 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120993 2571 flags.go:64] FLAG: --register-schedulable="true" Apr 16 22:13:44.125165 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.120997 2571 flags.go:64] FLAG: --register-with-taints="" Apr 16 22:13:44.125165 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121001 2571 flags.go:64] FLAG: --registry-burst="10" Apr 16 22:13:44.125165 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121004 2571 flags.go:64] FLAG: --registry-qps="5" Apr 16 22:13:44.125165 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121007 2571 flags.go:64] FLAG: --reserved-cpus="" Apr 16 22:13:44.125165 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121010 2571 flags.go:64] FLAG: --reserved-memory="" Apr 16 22:13:44.125165 ip-10-0-130-16 kubenswrapper[2571]: 
I0416 22:13:44.121014 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 22:13:44.125165 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121018 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 22:13:44.125165 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121020 2571 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 22:13:44.125165 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121023 2571 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 22:13:44.125165 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121026 2571 flags.go:64] FLAG: --runonce="false" Apr 16 22:13:44.125165 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121029 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 22:13:44.125165 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121032 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 22:13:44.125165 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121035 2571 flags.go:64] FLAG: --seccomp-default="false" Apr 16 22:13:44.125165 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121038 2571 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 22:13:44.125165 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121041 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 22:13:44.125165 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121044 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 22:13:44.125165 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121047 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 22:13:44.125165 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121050 2571 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 22:13:44.125165 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121053 2571 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 22:13:44.125165 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121056 2571 flags.go:64] FLAG: 
--storage-driver-table="stats" Apr 16 22:13:44.125165 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121058 2571 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 22:13:44.125165 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121062 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 22:13:44.125165 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121064 2571 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 22:13:44.125165 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121067 2571 flags.go:64] FLAG: --system-cgroups="" Apr 16 22:13:44.125862 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121070 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 22:13:44.125862 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121075 2571 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 22:13:44.125862 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121078 2571 flags.go:64] FLAG: --tls-cert-file="" Apr 16 22:13:44.125862 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121082 2571 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 22:13:44.125862 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121086 2571 flags.go:64] FLAG: --tls-min-version="" Apr 16 22:13:44.125862 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121091 2571 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 22:13:44.125862 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121094 2571 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 22:13:44.125862 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121097 2571 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 22:13:44.125862 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121099 2571 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 22:13:44.125862 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121102 2571 flags.go:64] FLAG: --v="2" Apr 16 22:13:44.125862 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121107 
2571 flags.go:64] FLAG: --version="false" Apr 16 22:13:44.125862 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121112 2571 flags.go:64] FLAG: --vmodule="" Apr 16 22:13:44.125862 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121116 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 22:13:44.125862 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.121120 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 22:13:44.125862 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121218 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:13:44.125862 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121222 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:13:44.125862 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121225 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 22:13:44.125862 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121228 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:13:44.125862 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121231 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:13:44.125862 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121233 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 22:13:44.125862 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121236 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:13:44.125862 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121238 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 22:13:44.125862 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121241 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 22:13:44.126451 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121244 2571 feature_gate.go:328] unrecognized 
feature gate: InsightsConfig Apr 16 22:13:44.126451 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121246 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 22:13:44.126451 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121249 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 22:13:44.126451 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121251 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 22:13:44.126451 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121254 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 22:13:44.126451 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121256 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 22:13:44.126451 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121259 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 22:13:44.126451 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121262 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 22:13:44.126451 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121265 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:13:44.126451 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121267 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:13:44.126451 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121270 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:13:44.126451 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121273 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 22:13:44.126451 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121275 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 22:13:44.126451 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121278 2571 feature_gate.go:328] 
unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:13:44.126451 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121282 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 22:13:44.126451 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121284 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:13:44.126451 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121287 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:13:44.126451 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121289 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:13:44.126451 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121292 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 22:13:44.126451 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121294 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:13:44.126990 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121297 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 22:13:44.126990 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121300 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 22:13:44.126990 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121302 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 22:13:44.126990 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121305 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 22:13:44.126990 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121308 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:13:44.126990 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121310 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:13:44.126990 
ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121313 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 22:13:44.126990 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121315 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 22:13:44.126990 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121318 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 22:13:44.126990 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121321 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:13:44.126990 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121323 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 22:13:44.126990 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121326 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 22:13:44.126990 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121328 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:13:44.126990 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121331 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 22:13:44.126990 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121333 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:13:44.126990 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121336 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 22:13:44.126990 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121339 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 22:13:44.126990 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121342 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 22:13:44.126990 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121344 2571 feature_gate.go:328] unrecognized feature gate: 
VSphereMultiNetworks Apr 16 22:13:44.126990 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121347 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 22:13:44.127488 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121351 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 22:13:44.127488 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121353 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:13:44.127488 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121356 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 22:13:44.127488 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121358 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:13:44.127488 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121361 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:13:44.127488 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121363 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 22:13:44.127488 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121367 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 22:13:44.127488 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121371 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 22:13:44.127488 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121374 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 22:13:44.127488 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121378 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 22:13:44.127488 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121382 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:13:44.127488 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121385 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 22:13:44.127488 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121388 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 22:13:44.127488 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121391 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 22:13:44.127488 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121394 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 22:13:44.127488 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121396 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 22:13:44.127488 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121399 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:13:44.127488 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121402 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:13:44.127488 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121404 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 22:13:44.128002 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121407 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 22:13:44.128002 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121409 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 22:13:44.128002 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121412 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:13:44.128002 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121414 
2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 22:13:44.128002 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121417 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:13:44.128002 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121419 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:13:44.128002 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121422 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:13:44.128002 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121424 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 22:13:44.128002 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121427 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 22:13:44.128002 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121430 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 22:13:44.128002 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121432 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:13:44.128002 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121434 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 22:13:44.128002 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121437 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 22:13:44.128002 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121439 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:13:44.128002 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121442 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 22:13:44.128002 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121445 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 22:13:44.128002 ip-10-0-130-16 
kubenswrapper[2571]: W0416 22:13:44.121447 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:13:44.128002 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.121450 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 22:13:44.128448 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.122111 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 22:13:44.129430 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.129303 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 22:13:44.129511 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.129432 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 22:13:44.129511 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129484 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:13:44.129511 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129489 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:13:44.129511 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129493 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 22:13:44.129511 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129496 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:13:44.129511 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129499 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 
22:13:44.129511 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129502 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 22:13:44.129511 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129504 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 22:13:44.129511 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129507 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 22:13:44.129511 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129510 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:13:44.129511 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129513 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 22:13:44.129511 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129516 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:13:44.129820 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129519 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:13:44.129820 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129522 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 22:13:44.129820 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129524 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:13:44.129820 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129527 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:13:44.129820 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129530 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:13:44.129820 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129532 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 22:13:44.129820 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129535 2571 feature_gate.go:328] unrecognized feature gate: 
MachineAPIMigration Apr 16 22:13:44.129820 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129538 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:13:44.129820 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129541 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:13:44.129820 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129544 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:13:44.129820 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129547 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 22:13:44.129820 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129550 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 22:13:44.129820 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129552 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 22:13:44.129820 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129555 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 22:13:44.129820 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129558 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 22:13:44.129820 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129560 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 22:13:44.129820 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129563 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 22:13:44.129820 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129565 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:13:44.129820 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129568 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:13:44.129820 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129570 
2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 22:13:44.130310 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129574 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 22:13:44.130310 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129577 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:13:44.130310 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129579 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:13:44.130310 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129582 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 22:13:44.130310 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129585 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:13:44.130310 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129587 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 22:13:44.130310 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129590 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 22:13:44.130310 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129592 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:13:44.130310 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129595 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:13:44.130310 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129597 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 22:13:44.130310 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129600 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 22:13:44.130310 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129602 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:13:44.130310 ip-10-0-130-16 
kubenswrapper[2571]: W0416 22:13:44.129605 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:13:44.130310 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129608 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 22:13:44.130310 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129611 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:13:44.130310 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129615 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 22:13:44.130310 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129619 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 22:13:44.130310 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129623 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 22:13:44.130310 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129625 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:13:44.130809 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129628 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 22:13:44.130809 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129631 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 22:13:44.130809 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129633 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 22:13:44.130809 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129636 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 22:13:44.130809 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129639 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 22:13:44.130809 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129642 2571 
feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 22:13:44.130809 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129645 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:13:44.130809 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129647 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:13:44.130809 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129650 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 22:13:44.130809 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129653 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 22:13:44.130809 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129655 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:13:44.130809 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129658 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 22:13:44.130809 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129660 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 22:13:44.130809 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129663 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 22:13:44.130809 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129668 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 22:13:44.130809 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129673 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 22:13:44.130809 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129675 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 22:13:44.130809 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129678 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:13:44.130809 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129681 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 22:13:44.131284 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129685 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 22:13:44.131284 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129688 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 22:13:44.131284 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129690 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 22:13:44.131284 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129693 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 22:13:44.131284 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129696 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 22:13:44.131284 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129698 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:13:44.131284 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129701 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 22:13:44.131284 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129705 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:13:44.131284 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129707 2571 feature_gate.go:328] unrecognized feature 
gate: ClusterAPIInstallIBMCloud Apr 16 22:13:44.131284 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129710 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:13:44.131284 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129713 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 22:13:44.131284 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129716 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 22:13:44.131284 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129718 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 22:13:44.131284 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129721 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:13:44.131284 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129724 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 22:13:44.131284 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129726 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 22:13:44.131284 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129729 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 22:13:44.131702 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.129734 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 22:13:44.131702 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129858 2571 
feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:13:44.131702 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129865 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 22:13:44.131702 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129868 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 22:13:44.131702 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129872 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 22:13:44.131702 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129874 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:13:44.131702 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129877 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:13:44.131702 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129880 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 22:13:44.131702 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129882 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 22:13:44.131702 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129885 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 22:13:44.131702 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129888 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 22:13:44.131702 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129891 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:13:44.131702 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129894 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 22:13:44.131702 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129896 2571 feature_gate.go:328] unrecognized feature gate: 
OpenShiftPodSecurityAdmission Apr 16 22:13:44.131702 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129899 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 22:13:44.132166 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129901 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 22:13:44.132166 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129904 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:13:44.132166 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129906 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:13:44.132166 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129909 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 22:13:44.132166 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129911 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 22:13:44.132166 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129914 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 22:13:44.132166 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129917 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:13:44.132166 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129920 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 22:13:44.132166 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129922 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:13:44.132166 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129925 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 22:13:44.132166 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129928 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:13:44.132166 ip-10-0-130-16 kubenswrapper[2571]: 
W0416 22:13:44.129930 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 22:13:44.132166 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129933 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 22:13:44.132166 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129936 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 22:13:44.132166 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129938 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:13:44.132166 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129941 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:13:44.132166 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129943 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:13:44.132166 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129946 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:13:44.132166 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129949 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:13:44.132166 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129951 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:13:44.132657 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129953 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 22:13:44.132657 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129956 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 22:13:44.132657 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129959 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 22:13:44.132657 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129961 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 22:13:44.132657 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129963 
2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 22:13:44.132657 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129966 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 22:13:44.132657 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129969 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 22:13:44.132657 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129972 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:13:44.132657 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129974 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 22:13:44.132657 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129978 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:13:44.132657 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129980 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 22:13:44.132657 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129983 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 22:13:44.132657 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129985 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 22:13:44.132657 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129988 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 22:13:44.132657 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129990 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 22:13:44.132657 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129993 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:13:44.132657 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129995 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:13:44.132657 
ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.129998 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 22:13:44.132657 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130002 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 22:13:44.132657 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130006 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 22:13:44.133218 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130010 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 22:13:44.133218 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130012 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:13:44.133218 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130015 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:13:44.133218 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130018 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 22:13:44.133218 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130020 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:13:44.133218 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130023 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 22:13:44.133218 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130025 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 22:13:44.133218 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130028 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 22:13:44.133218 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130032 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 22:13:44.133218 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130035 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 22:13:44.133218 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130038 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:13:44.133218 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130041 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 22:13:44.133218 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130044 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 22:13:44.133218 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130047 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 22:13:44.133218 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130049 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 22:13:44.133218 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130052 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:13:44.133218 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130054 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 22:13:44.133218 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130057 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 22:13:44.133218 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130059 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:13:44.133725 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130062 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 22:13:44.133725 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130064 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 22:13:44.133725 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130067 2571 feature_gate.go:328] unrecognized feature gate: 
PreconfiguredUDNAddresses Apr 16 22:13:44.133725 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130070 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 22:13:44.133725 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130072 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:13:44.133725 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130075 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 22:13:44.133725 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130078 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 22:13:44.133725 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130081 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:13:44.133725 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130083 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 22:13:44.133725 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130086 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 22:13:44.133725 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130088 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:13:44.133725 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130091 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:13:44.133725 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:44.130093 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:13:44.133725 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.130098 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false 
ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 22:13:44.133725 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.130886 2571 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 22:13:44.134128 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.133103 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 22:13:44.134341 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.134328 2571 server.go:1019] "Starting client certificate rotation" Apr 16 22:13:44.134442 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.134423 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 22:13:44.134478 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.134469 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 22:13:44.166357 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.166334 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 22:13:44.168834 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.168803 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 22:13:44.186170 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.186141 2571 log.go:25] "Validated CRI v1 runtime API" Apr 16 22:13:44.192370 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.192348 2571 log.go:25] "Validated CRI v1 image API" Apr 16 22:13:44.193703 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.193687 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 22:13:44.194702 ip-10-0-130-16 
kubenswrapper[2571]: I0416 22:13:44.194686 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 22:13:44.199128 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.199102 2571 fs.go:135] Filesystem UUIDs: map[2ac81054-05c9-4ec3-b892-0a49611e3e3a:/dev/nvme0n1p3 6eb69a79-3bd0-43fb-9740-ad34a0fc2cb0:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2] Apr 16 22:13:44.199183 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.199128 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 22:13:44.205870 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.205739 2571 manager.go:217] Machine: {Timestamp:2026-04-16 22:13:44.202949885 +0000 UTC m=+0.499658335 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3072888 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2441ed4223fe578a9476dc2cd2b892 SystemUUID:ec2441ed-4223-fe57-8a94-76dc2cd2b892 BootID:030a1bd0-ec98-4bb8-af33-51489d8cb30d Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run 
DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:8c:a7:41:9d:2d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:8c:a7:41:9d:2d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:92:de:1f:61:81:7c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 22:13:44.205870 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.205864 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 16 22:13:44.206034 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.206021 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 22:13:44.206383 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.206356 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 22:13:44.206525 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.206384 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-16.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 22:13:44.206574 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.206534 2571 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 22:13:44.206574 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.206543 2571 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 22:13:44.206574 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.206557 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 22:13:44.208387 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.208375 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 22:13:44.210072 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.210061 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 22:13:44.210187 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.210178 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 22:13:44.214801 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.214761 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dtc9w"
Apr 16 22:13:44.215490 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.215477 2571 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 22:13:44.215521 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.215502 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 22:13:44.215521 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.215515 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 22:13:44.215586 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.215529 2571 kubelet.go:397] "Adding apiserver pod source"
Apr 16 22:13:44.215586 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.215545 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 22:13:44.216788 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.216761 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 22:13:44.216829 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.216801 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 22:13:44.218754 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.218734 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dtc9w"
Apr 16 22:13:44.220370 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.220355 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 22:13:44.221712 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.221695 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 22:13:44.223646 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.223634 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 22:13:44.223707 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.223661 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 22:13:44.223707 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.223668 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 22:13:44.223707 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.223674 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 22:13:44.223707 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.223680 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 22:13:44.223707 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.223686 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 22:13:44.223707 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.223692 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 22:13:44.223707 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.223697 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 22:13:44.223707 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.223705 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 22:13:44.223707 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.223712 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 22:13:44.223968 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.223724 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 22:13:44.223968 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.223733 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 22:13:44.224669 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.224658 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 22:13:44.224669 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.224669 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 22:13:44.228673 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.228657 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 22:13:44.228804 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.228702 2571 server.go:1295] "Started kubelet"
Apr 16 22:13:44.228804 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.228744 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 22:13:44.228899 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.228859 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 22:13:44.231299 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.231271 2571 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 22:13:44.231541 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.231515 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:44.231906 ip-10-0-130-16 systemd[1]: Started Kubernetes Kubelet.
Apr 16 22:13:44.233092 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.233076 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 22:13:44.234423 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.234403 2571 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 22:13:44.235701 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.235679 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:44.237476 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.237459 2571 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-16.ec2.internal" not found
Apr 16 22:13:44.241027 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.241006 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 22:13:44.241129 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.241024 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 22:13:44.241730 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.241710 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 22:13:44.241730 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.241712 2571 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 22:13:44.241901 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.241742 2571 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 22:13:44.241901 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:44.241853 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-16.ec2.internal\" not found"
Apr 16 22:13:44.241969 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.241906 2571 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 22:13:44.241969 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.241916 2571 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 22:13:44.242020 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.242006 2571 factory.go:55] Registering systemd factory
Apr 16 22:13:44.242020 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.242019 2571 factory.go:223] Registration of the systemd container factory successfully
Apr 16 22:13:44.242263 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.242243 2571 factory.go:153] Registering CRI-O factory
Apr 16 22:13:44.242263 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.242265 2571 factory.go:223] Registration of the crio container factory successfully
Apr 16 22:13:44.242415 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.242339 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 22:13:44.242415 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.242365 2571 factory.go:103] Registering Raw factory
Apr 16 22:13:44.242415 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.242382 2571 manager.go:1196] Started watching for new ooms in manager
Apr 16 22:13:44.242672 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:44.242648 2571 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 22:13:44.242802 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.242791 2571 manager.go:319] Starting recovery of all containers
Apr 16 22:13:44.243265 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.243245 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:44.246753 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:44.246730 2571 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-16.ec2.internal\" not found" node="ip-10-0-130-16.ec2.internal"
Apr 16 22:13:44.252021 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.251995 2571 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-16.ec2.internal" not found
Apr 16 22:13:44.253695 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.253677 2571 manager.go:324] Recovery completed
Apr 16 22:13:44.259846 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.259829 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:13:44.262564 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.262544 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-16.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:13:44.262640 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.262575 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-16.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:13:44.262640 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.262585 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-16.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:13:44.263117 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.263102 2571 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 22:13:44.263117 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.263113 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 22:13:44.263214 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.263133 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 22:13:44.265064 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.265052 2571 policy_none.go:49] "None policy: Start"
Apr 16 22:13:44.265101 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.265069 2571 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 22:13:44.265101 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.265079 2571 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 22:13:44.304473 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.304448 2571 manager.go:341] "Starting Device Plugin manager"
Apr 16 22:13:44.317197 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:44.304530 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 22:13:44.317197 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.304544 2571 server.go:85] "Starting device plugin registration server"
Apr 16 22:13:44.317197 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.304815 2571 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 22:13:44.317197 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.304825 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 22:13:44.317197 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.304925 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 22:13:44.317197 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.305017 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 22:13:44.317197 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.305027 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 22:13:44.317197 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:44.305850 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 22:13:44.317197 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:44.305886 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-16.ec2.internal\" not found"
Apr 16 22:13:44.317197 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.313513 2571 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-16.ec2.internal" not found
Apr 16 22:13:44.364833 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.364732 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 22:13:44.366567 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.366537 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 22:13:44.366567 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.366572 2571 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 22:13:44.366755 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.366596 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 22:13:44.366755 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.366603 2571 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 22:13:44.366755 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:44.366705 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 22:13:44.369967 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.369950 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:44.405679 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.405640 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:13:44.406666 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.406648 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-16.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:13:44.406759 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.406677 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-16.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:13:44.406759 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.406687 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-16.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:13:44.406759 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.406712 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-16.ec2.internal"
Apr 16 22:13:44.413085 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.413056 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-16.ec2.internal"
Apr 16 22:13:44.413179 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:44.413086 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-16.ec2.internal\": node \"ip-10-0-130-16.ec2.internal\" not found"
Apr 16 22:13:44.467328 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.467284 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-16.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-16.ec2.internal"]
Apr 16 22:13:44.469611 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.469594 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-16.ec2.internal"
Apr 16 22:13:44.469687 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.469606 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-16.ec2.internal"
Apr 16 22:13:44.495703 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.495675 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-16.ec2.internal"
Apr 16 22:13:44.500251 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.500234 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-16.ec2.internal"
Apr 16 22:13:44.509264 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.509238 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 22:13:44.512238 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.512220 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 22:13:44.543595 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.543560 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9690e09cc5a9f42f04eae10c4110dd59-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-16.ec2.internal\" (UID: \"9690e09cc5a9f42f04eae10c4110dd59\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-16.ec2.internal"
Apr 16 22:13:44.543789 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.543600 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9690e09cc5a9f42f04eae10c4110dd59-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-16.ec2.internal\" (UID: \"9690e09cc5a9f42f04eae10c4110dd59\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-16.ec2.internal"
Apr 16 22:13:44.543789 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.543625 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7d17133b5a635b239940c82faeb49a51-config\") pod \"kube-apiserver-proxy-ip-10-0-130-16.ec2.internal\" (UID: \"7d17133b5a635b239940c82faeb49a51\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-16.ec2.internal"
Apr 16 22:13:44.644618 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.644529 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9690e09cc5a9f42f04eae10c4110dd59-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-16.ec2.internal\" (UID: \"9690e09cc5a9f42f04eae10c4110dd59\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-16.ec2.internal"
Apr 16 22:13:44.644618 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.644563 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9690e09cc5a9f42f04eae10c4110dd59-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-16.ec2.internal\" (UID: \"9690e09cc5a9f42f04eae10c4110dd59\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-16.ec2.internal"
Apr 16 22:13:44.644618 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.644580 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7d17133b5a635b239940c82faeb49a51-config\") pod \"kube-apiserver-proxy-ip-10-0-130-16.ec2.internal\" (UID: \"7d17133b5a635b239940c82faeb49a51\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-16.ec2.internal"
Apr 16 22:13:44.644861 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.644624 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7d17133b5a635b239940c82faeb49a51-config\") pod \"kube-apiserver-proxy-ip-10-0-130-16.ec2.internal\" (UID: \"7d17133b5a635b239940c82faeb49a51\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-16.ec2.internal"
Apr 16 22:13:44.644861 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.644641 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9690e09cc5a9f42f04eae10c4110dd59-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-16.ec2.internal\" (UID: \"9690e09cc5a9f42f04eae10c4110dd59\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-16.ec2.internal"
Apr 16 22:13:44.644861 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.644645 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9690e09cc5a9f42f04eae10c4110dd59-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-16.ec2.internal\" (UID: \"9690e09cc5a9f42f04eae10c4110dd59\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-16.ec2.internal"
Apr 16 22:13:44.813906 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.813864 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-16.ec2.internal"
Apr 16 22:13:44.814090 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:44.813969 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-16.ec2.internal"
Apr 16 22:13:45.133941 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.133849 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 22:13:45.134682 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.134062 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 22:13:45.134682 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.134087 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 22:13:45.134682 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.134067 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 22:13:45.216006 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.215967 2571 apiserver.go:52] "Watching apiserver"
Apr 16 22:13:45.221397 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.221348 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 22:08:44 +0000 UTC" deadline="2027-10-17 12:16:10.43018389 +0000 UTC"
Apr 16 22:13:45.221397 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.221390 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13166h2m25.208796592s"
Apr 16 22:13:45.224309 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.224283 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 22:13:45.226179 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.226157 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-130-16.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7","openshift-dns/node-resolver-nwdxm","openshift-image-registry/node-ca-kjs6v","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-16.ec2.internal","openshift-multus/multus-additional-cni-plugins-jdp2m","openshift-multus/multus-q92dp","openshift-network-operator/iptables-alerter-4vrs4","kube-system/konnectivity-agent-r6qrd","openshift-cluster-node-tuning-operator/tuned-6h8wj","openshift-multus/network-metrics-daemon-89m59","openshift-network-diagnostics/network-check-target-gzcjz","openshift-ovn-kubernetes/ovnkube-node-88hfq"]
Apr 16 22:13:45.228395 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.228374 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7"
Apr 16 22:13:45.228603 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.228569 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-nwdxm"
Apr 16 22:13:45.229475 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.229455 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kjs6v"
Apr 16 22:13:45.230467 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.230433 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jdp2m"
Apr 16 22:13:45.233357 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.233332 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 22:13:45.233476 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.233395 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 22:13:45.233476 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.233436 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 22:13:45.233476 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.233444 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-jgllw\""
Apr 16 22:13:45.233630 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.233339 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 22:13:45.233630 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.233555 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 22:13:45.233776 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.233740 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 22:13:45.234087 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.234069 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 22:13:45.234087 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.234082 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-k2vjm\""
Apr 16 22:13:45.234233 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.234136 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5bh96\""
Apr 16 22:13:45.234233 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.234068 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 22:13:45.234233 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.234184 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 22:13:45.234233 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.234213 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vpbk8\""
Apr 16 22:13:45.234414 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.234388 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 22:13:45.236291 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.234558 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 22:13:45.236291 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.234749 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4vrs4"
Apr 16 22:13:45.236291 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.234847 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 22:13:45.236291 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.235491 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 22:13:45.236511 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.236331 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-q92dp"
Apr 16 22:13:45.236511 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.236499 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-r6qrd"
Apr 16 22:13:45.237045 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.237028 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 22:13:45.237272 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.237253 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 22:13:45.237365 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.237322 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 22:13:45.237425 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.237322 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jhztv\""
Apr 16 22:13:45.237584 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.237567 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6h8wj"
Apr 16 22:13:45.238491 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.238455 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 22:13:45.238727 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.238708 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-89m59"
Apr 16 22:13:45.238863 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.238743 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 22:13:45.238863 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.238747 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-7lvjq\""
Apr 16 22:13:45.238863 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:45.238843 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-89m59" podUID="3eb950f1-309c-40b0-a9b2-13d46c7e2d60"
Apr 16 22:13:45.239070 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.239058 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 22:13:45.239158 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.239143 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-82tk7\""
Apr 16 22:13:45.239811 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.239746 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gzcjz" Apr 16 22:13:45.239913 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:45.239889 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gzcjz" podUID="8e076f04-d3ed-41ce-ade7-ecc7c991025a" Apr 16 22:13:45.240951 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.240932 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.241601 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.241582 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 22:13:45.242013 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.241894 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-6j4vn\"" Apr 16 22:13:45.242013 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.241948 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 22:13:45.242013 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.242001 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:13:45.242941 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.242904 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 22:13:45.243109 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.243095 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 22:13:45.243390 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.243372 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 22:13:45.243430 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.243416 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-dkljr\"" Apr 16 22:13:45.244545 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.244525 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 22:13:45.244636 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.244549 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 22:13:45.244636 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.244563 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 22:13:45.244636 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.244597 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 22:13:45.248105 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.248070 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/41a45a8e-91e8-4a52-90bd-9fddbbac91d0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jdp2m\" (UID: \"41a45a8e-91e8-4a52-90bd-9fddbbac91d0\") " pod="openshift-multus/multus-additional-cni-plugins-jdp2m" Apr 16 22:13:45.248184 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.248119 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-fhj7v\" (UniqueName: \"kubernetes.io/projected/41a45a8e-91e8-4a52-90bd-9fddbbac91d0-kube-api-access-fhj7v\") pod \"multus-additional-cni-plugins-jdp2m\" (UID: \"41a45a8e-91e8-4a52-90bd-9fddbbac91d0\") " pod="openshift-multus/multus-additional-cni-plugins-jdp2m" Apr 16 22:13:45.248184 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.248148 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-host-var-lib-cni-multus\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.248184 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.248164 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-host-var-lib-kubelet\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.248184 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.248180 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b7a91f2b-d56f-4cd1-86e6-533c19173cab-serviceca\") pod \"node-ca-kjs6v\" (UID: \"b7a91f2b-d56f-4cd1-86e6-533c19173cab\") " pod="openshift-image-registry/node-ca-kjs6v" Apr 16 22:13:45.248311 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.248195 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-host-kubelet\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.248311 ip-10-0-130-16 kubenswrapper[2571]: 
I0416 22:13:45.248213 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-run-openvswitch\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.248311 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.248226 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-log-socket\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.248311 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.248256 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-host-cni-netd\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.248311 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.248283 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-run\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" Apr 16 22:13:45.248311 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.248304 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mh9f\" (UniqueName: \"kubernetes.io/projected/209bb68d-9579-4062-8913-98bd2b4c61a1-kube-api-access-9mh9f\") pod \"aws-ebs-csi-driver-node-dl6n7\" (UID: \"209bb68d-9579-4062-8913-98bd2b4c61a1\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7" Apr 16 22:13:45.248492 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.248343 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/41a45a8e-91e8-4a52-90bd-9fddbbac91d0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jdp2m\" (UID: \"41a45a8e-91e8-4a52-90bd-9fddbbac91d0\") " pod="openshift-multus/multus-additional-cni-plugins-jdp2m" Apr 16 22:13:45.248492 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.248370 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-cni-binary-copy\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.248492 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.248404 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-multus-socket-dir-parent\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.248492 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.248426 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-host-run-netns\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.248492 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.248455 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/43ef0782-3c86-49cc-a384-b383bf10c750-hosts-file\") pod \"node-resolver-nwdxm\" (UID: \"43ef0782-3c86-49cc-a384-b383bf10c750\") " pod="openshift-dns/node-resolver-nwdxm" Apr 16 22:13:45.248492 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.248470 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d31d911a-3d25-4b6f-ae92-8ec66384bbbf-konnectivity-ca\") pod \"konnectivity-agent-r6qrd\" (UID: \"d31d911a-3d25-4b6f-ae92-8ec66384bbbf\") " pod="kube-system/konnectivity-agent-r6qrd" Apr 16 22:13:45.248492 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.248484 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/41a45a8e-91e8-4a52-90bd-9fddbbac91d0-os-release\") pod \"multus-additional-cni-plugins-jdp2m\" (UID: \"41a45a8e-91e8-4a52-90bd-9fddbbac91d0\") " pod="openshift-multus/multus-additional-cni-plugins-jdp2m" Apr 16 22:13:45.248686 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.248510 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a0f30e25-01b8-4134-8184-2a06ef526c62-ovnkube-config\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.248964 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.248855 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-etc-sysconfig\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" Apr 16 22:13:45.249050 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.248987 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-etc-kubernetes\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" Apr 16 22:13:45.249050 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.249024 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-etc-kubernetes\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.249148 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.249096 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-systemd-units\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.249199 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.249145 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-run-systemd\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.249199 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.249179 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-host-cni-bin\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 
22:13:45.249294 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.249207 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-etc-sysctl-conf\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" Apr 16 22:13:45.249294 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.249248 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/209bb68d-9579-4062-8913-98bd2b4c61a1-etc-selinux\") pod \"aws-ebs-csi-driver-node-dl6n7\" (UID: \"209bb68d-9579-4062-8913-98bd2b4c61a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7" Apr 16 22:13:45.249393 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.249305 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/41a45a8e-91e8-4a52-90bd-9fddbbac91d0-cni-binary-copy\") pod \"multus-additional-cni-plugins-jdp2m\" (UID: \"41a45a8e-91e8-4a52-90bd-9fddbbac91d0\") " pod="openshift-multus/multus-additional-cni-plugins-jdp2m" Apr 16 22:13:45.249393 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.249345 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-run-ovn\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.249393 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.249377 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-etc-modprobe-d\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" Apr 16 22:13:45.249539 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.249415 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/41a45a8e-91e8-4a52-90bd-9fddbbac91d0-cnibin\") pod \"multus-additional-cni-plugins-jdp2m\" (UID: \"41a45a8e-91e8-4a52-90bd-9fddbbac91d0\") " pod="openshift-multus/multus-additional-cni-plugins-jdp2m" Apr 16 22:13:45.249539 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.249445 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-multus-cni-dir\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.249539 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.249477 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-host-run-netns\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.249539 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.249502 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f4c633be-7a8f-41f5-ba11-b35591028a41-host-slash\") pod \"iptables-alerter-4vrs4\" (UID: \"f4c633be-7a8f-41f5-ba11-b35591028a41\") " pod="openshift-network-operator/iptables-alerter-4vrs4" Apr 16 22:13:45.249737 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.249552 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-node-log\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.249737 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.249586 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.249737 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.249616 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-etc-systemd\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" Apr 16 22:13:45.249737 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.249646 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/47517e74-f8d0-404e-8053-08ab938c2d36-tmp\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" Apr 16 22:13:45.249737 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.249672 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-hostroot\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.249737 
ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.249706 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-multus-daemon-config\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.250028 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.249804 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-host-run-multus-certs\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.250028 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.249842 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckdpn\" (UniqueName: \"kubernetes.io/projected/8e076f04-d3ed-41ce-ade7-ecc7c991025a-kube-api-access-ckdpn\") pod \"network-check-target-gzcjz\" (UID: \"8e076f04-d3ed-41ce-ade7-ecc7c991025a\") " pod="openshift-network-diagnostics/network-check-target-gzcjz" Apr 16 22:13:45.250028 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.249889 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-etc-openvswitch\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.250028 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.249934 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d31d911a-3d25-4b6f-ae92-8ec66384bbbf-agent-certs\") pod 
\"konnectivity-agent-r6qrd\" (UID: \"d31d911a-3d25-4b6f-ae92-8ec66384bbbf\") " pod="kube-system/konnectivity-agent-r6qrd" Apr 16 22:13:45.250028 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.249965 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/209bb68d-9579-4062-8913-98bd2b4c61a1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dl6n7\" (UID: \"209bb68d-9579-4062-8913-98bd2b4c61a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7" Apr 16 22:13:45.250028 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250000 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-host-var-lib-cni-bin\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.250286 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250060 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmq2s\" (UniqueName: \"kubernetes.io/projected/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-kube-api-access-pmq2s\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.250286 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250090 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-etc-sysctl-d\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" Apr 16 22:13:45.250286 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250115 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-sys\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" Apr 16 22:13:45.250286 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250144 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/41a45a8e-91e8-4a52-90bd-9fddbbac91d0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jdp2m\" (UID: \"41a45a8e-91e8-4a52-90bd-9fddbbac91d0\") " pod="openshift-multus/multus-additional-cni-plugins-jdp2m" Apr 16 22:13:45.250286 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250175 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f4c633be-7a8f-41f5-ba11-b35591028a41-iptables-alerter-script\") pod \"iptables-alerter-4vrs4\" (UID: \"f4c633be-7a8f-41f5-ba11-b35591028a41\") " pod="openshift-network-operator/iptables-alerter-4vrs4" Apr 16 22:13:45.250286 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250204 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjrkh\" (UniqueName: \"kubernetes.io/projected/b7a91f2b-d56f-4cd1-86e6-533c19173cab-kube-api-access-sjrkh\") pod \"node-ca-kjs6v\" (UID: \"b7a91f2b-d56f-4cd1-86e6-533c19173cab\") " pod="openshift-image-registry/node-ca-kjs6v" Apr 16 22:13:45.250286 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250234 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-cnibin\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.250595 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250304 
2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-var-lib-openvswitch\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:13:45.250595 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250334 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/47517e74-f8d0-404e-8053-08ab938c2d36-etc-tuned\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj"
Apr 16 22:13:45.250595 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250362 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/209bb68d-9579-4062-8913-98bd2b4c61a1-device-dir\") pod \"aws-ebs-csi-driver-node-dl6n7\" (UID: \"209bb68d-9579-4062-8913-98bd2b4c61a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7"
Apr 16 22:13:45.250595 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250392 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-host-run-k8s-cni-cncf-io\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp"
Apr 16 22:13:45.250595 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250422 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-multus-conf-dir\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp"
Apr 16 22:13:45.250595 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250452 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-host-slash\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:13:45.250595 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250482 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a0f30e25-01b8-4134-8184-2a06ef526c62-ovn-node-metrics-cert\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:13:45.250595 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250506 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/43ef0782-3c86-49cc-a384-b383bf10c750-tmp-dir\") pod \"node-resolver-nwdxm\" (UID: \"43ef0782-3c86-49cc-a384-b383bf10c750\") " pod="openshift-dns/node-resolver-nwdxm"
Apr 16 22:13:45.250595 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250544 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs\") pod \"network-metrics-daemon-89m59\" (UID: \"3eb950f1-309c-40b0-a9b2-13d46c7e2d60\") " pod="openshift-multus/network-metrics-daemon-89m59"
Apr 16 22:13:45.250595 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250571 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/209bb68d-9579-4062-8913-98bd2b4c61a1-socket-dir\") pod \"aws-ebs-csi-driver-node-dl6n7\" (UID: \"209bb68d-9579-4062-8913-98bd2b4c61a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7"
Apr 16 22:13:45.252227 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250602 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-system-cni-dir\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp"
Apr 16 22:13:45.252227 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250634 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-os-release\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp"
Apr 16 22:13:45.252227 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250664 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a0f30e25-01b8-4134-8184-2a06ef526c62-env-overrides\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:13:45.252227 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250695 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lctw2\" (UniqueName: \"kubernetes.io/projected/a0f30e25-01b8-4134-8184-2a06ef526c62-kube-api-access-lctw2\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:13:45.252227 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250818 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cfp8\" (UniqueName: \"kubernetes.io/projected/47517e74-f8d0-404e-8053-08ab938c2d36-kube-api-access-6cfp8\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj"
Apr 16 22:13:45.252227 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250851 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r2mt\" (UniqueName: \"kubernetes.io/projected/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-kube-api-access-5r2mt\") pod \"network-metrics-daemon-89m59\" (UID: \"3eb950f1-309c-40b0-a9b2-13d46c7e2d60\") " pod="openshift-multus/network-metrics-daemon-89m59"
Apr 16 22:13:45.252227 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250883 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/209bb68d-9579-4062-8913-98bd2b4c61a1-sys-fs\") pod \"aws-ebs-csi-driver-node-dl6n7\" (UID: \"209bb68d-9579-4062-8913-98bd2b4c61a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7"
Apr 16 22:13:45.252227 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250913 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/41a45a8e-91e8-4a52-90bd-9fddbbac91d0-system-cni-dir\") pod \"multus-additional-cni-plugins-jdp2m\" (UID: \"41a45a8e-91e8-4a52-90bd-9fddbbac91d0\") " pod="openshift-multus/multus-additional-cni-plugins-jdp2m"
Apr 16 22:13:45.252227 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250944 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-var-lib-kubelet\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj"
Apr 16 22:13:45.252227 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250971 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-host\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj"
Apr 16 22:13:45.252227 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.250999 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/209bb68d-9579-4062-8913-98bd2b4c61a1-registration-dir\") pod \"aws-ebs-csi-driver-node-dl6n7\" (UID: \"209bb68d-9579-4062-8913-98bd2b4c61a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7"
Apr 16 22:13:45.252227 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.251030 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl85v\" (UniqueName: \"kubernetes.io/projected/43ef0782-3c86-49cc-a384-b383bf10c750-kube-api-access-pl85v\") pod \"node-resolver-nwdxm\" (UID: \"43ef0782-3c86-49cc-a384-b383bf10c750\") " pod="openshift-dns/node-resolver-nwdxm"
Apr 16 22:13:45.252227 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.251059 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-lib-modules\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj"
Apr 16 22:13:45.252227 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.251081 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7a91f2b-d56f-4cd1-86e6-533c19173cab-host\") pod \"node-ca-kjs6v\" (UID: \"b7a91f2b-d56f-4cd1-86e6-533c19173cab\") " pod="openshift-image-registry/node-ca-kjs6v"
Apr 16 22:13:45.252227 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.251110 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74sn8\" (UniqueName: \"kubernetes.io/projected/f4c633be-7a8f-41f5-ba11-b35591028a41-kube-api-access-74sn8\") pod \"iptables-alerter-4vrs4\" (UID: \"f4c633be-7a8f-41f5-ba11-b35591028a41\") " pod="openshift-network-operator/iptables-alerter-4vrs4"
Apr 16 22:13:45.252227 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.251148 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-host-run-ovn-kubernetes\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:13:45.252818 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.251184 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a0f30e25-01b8-4134-8184-2a06ef526c62-ovnkube-script-lib\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:13:45.252818 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.252356 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 22:13:45.282455 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.282285 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-dh66m"
Apr 16 22:13:45.288909 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.288872 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-dh66m"
Apr 16 22:13:45.289967 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:45.289928 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d17133b5a635b239940c82faeb49a51.slice/crio-4960691fad943365617a2105d27b21a51e5fb82511e8be992b6018f705285746 WatchSource:0}: Error finding container 4960691fad943365617a2105d27b21a51e5fb82511e8be992b6018f705285746: Status 404 returned error can't find the container with id 4960691fad943365617a2105d27b21a51e5fb82511e8be992b6018f705285746
Apr 16 22:13:45.290212 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:45.290193 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9690e09cc5a9f42f04eae10c4110dd59.slice/crio-84393c6ff3f7bf09bcd448703152ce18eef55b4a66dc9b97ef6bfd4276deae24 WatchSource:0}: Error finding container 84393c6ff3f7bf09bcd448703152ce18eef55b4a66dc9b97ef6bfd4276deae24: Status 404 returned error can't find the container with id 84393c6ff3f7bf09bcd448703152ce18eef55b4a66dc9b97ef6bfd4276deae24
Apr 16 22:13:45.295981 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.295962 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 22:13:45.351578 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.351537 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjrkh\" (UniqueName: \"kubernetes.io/projected/b7a91f2b-d56f-4cd1-86e6-533c19173cab-kube-api-access-sjrkh\") pod \"node-ca-kjs6v\" (UID: \"b7a91f2b-d56f-4cd1-86e6-533c19173cab\") " pod="openshift-image-registry/node-ca-kjs6v"
Apr 16 22:13:45.351578 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.351577 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-cnibin\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp"
Apr 16 22:13:45.351846 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.351603 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-var-lib-openvswitch\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:13:45.351846 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.351625 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/47517e74-f8d0-404e-8053-08ab938c2d36-etc-tuned\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj"
Apr 16 22:13:45.351846 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.351645 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/209bb68d-9579-4062-8913-98bd2b4c61a1-device-dir\") pod \"aws-ebs-csi-driver-node-dl6n7\" (UID: \"209bb68d-9579-4062-8913-98bd2b4c61a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7"
Apr 16 22:13:45.351846 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.351669 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-host-run-k8s-cni-cncf-io\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp"
Apr 16 22:13:45.351846 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.351689 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-multus-conf-dir\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp"
Apr 16 22:13:45.351846 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.351691 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-var-lib-openvswitch\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:13:45.351846 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.351711 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-host-slash\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:13:45.351846 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.351691 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-cnibin\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp"
Apr 16 22:13:45.351846 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.351734 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-host-run-k8s-cni-cncf-io\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp"
Apr 16 22:13:45.351846 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.351745 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a0f30e25-01b8-4134-8184-2a06ef526c62-ovn-node-metrics-cert\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:13:45.351846 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.351788 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-multus-conf-dir\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp"
Apr 16 22:13:45.352333 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.351745 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/209bb68d-9579-4062-8913-98bd2b4c61a1-device-dir\") pod \"aws-ebs-csi-driver-node-dl6n7\" (UID: \"209bb68d-9579-4062-8913-98bd2b4c61a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7"
Apr 16 22:13:45.352333 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.352092 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-host-slash\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:13:45.352333 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.352177 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/43ef0782-3c86-49cc-a384-b383bf10c750-tmp-dir\") pod \"node-resolver-nwdxm\" (UID: \"43ef0782-3c86-49cc-a384-b383bf10c750\") " pod="openshift-dns/node-resolver-nwdxm"
Apr 16 22:13:45.352333 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.352231 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs\") pod \"network-metrics-daemon-89m59\" (UID: \"3eb950f1-309c-40b0-a9b2-13d46c7e2d60\") " pod="openshift-multus/network-metrics-daemon-89m59"
Apr 16 22:13:45.352333 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.352225 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 22:13:45.352333 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.352274 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/209bb68d-9579-4062-8913-98bd2b4c61a1-socket-dir\") pod \"aws-ebs-csi-driver-node-dl6n7\" (UID: \"209bb68d-9579-4062-8913-98bd2b4c61a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7"
Apr 16 22:13:45.352333 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.352306 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-system-cni-dir\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp"
Apr 16 22:13:45.352333 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.352331 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-os-release\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp"
Apr 16 22:13:45.352727 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.352362 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a0f30e25-01b8-4134-8184-2a06ef526c62-env-overrides\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:13:45.352727 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.352397 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lctw2\" (UniqueName: \"kubernetes.io/projected/a0f30e25-01b8-4134-8184-2a06ef526c62-kube-api-access-lctw2\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:13:45.352727 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.352428 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cfp8\" (UniqueName: \"kubernetes.io/projected/47517e74-f8d0-404e-8053-08ab938c2d36-kube-api-access-6cfp8\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj"
Apr 16 22:13:45.352727 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.352459 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5r2mt\" (UniqueName: \"kubernetes.io/projected/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-kube-api-access-5r2mt\") pod \"network-metrics-daemon-89m59\" (UID: \"3eb950f1-309c-40b0-a9b2-13d46c7e2d60\") " pod="openshift-multus/network-metrics-daemon-89m59"
Apr 16 22:13:45.352727 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.352489 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/209bb68d-9579-4062-8913-98bd2b4c61a1-sys-fs\") pod \"aws-ebs-csi-driver-node-dl6n7\" (UID: \"209bb68d-9579-4062-8913-98bd2b4c61a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7"
Apr 16 22:13:45.352727 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.352521 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/41a45a8e-91e8-4a52-90bd-9fddbbac91d0-system-cni-dir\") pod \"multus-additional-cni-plugins-jdp2m\" (UID: \"41a45a8e-91e8-4a52-90bd-9fddbbac91d0\") " pod="openshift-multus/multus-additional-cni-plugins-jdp2m"
Apr 16 22:13:45.352727 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.352614 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-var-lib-kubelet\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj"
Apr 16 22:13:45.352727 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.352632 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/209bb68d-9579-4062-8913-98bd2b4c61a1-socket-dir\") pod \"aws-ebs-csi-driver-node-dl6n7\" (UID: \"209bb68d-9579-4062-8913-98bd2b4c61a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7"
Apr 16 22:13:45.352727 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.352678 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-system-cni-dir\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp"
Apr 16 22:13:45.353144 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.352789 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-os-release\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp"
Apr 16 22:13:45.353144 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.353101 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/43ef0782-3c86-49cc-a384-b383bf10c750-tmp-dir\") pod \"node-resolver-nwdxm\" (UID: \"43ef0782-3c86-49cc-a384-b383bf10c750\") " pod="openshift-dns/node-resolver-nwdxm"
Apr 16 22:13:45.355375 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:45.353332 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:45.355375 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:45.353437 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs podName:3eb950f1-309c-40b0-a9b2-13d46c7e2d60 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:45.853403882 +0000 UTC m=+2.150112336 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs") pod "network-metrics-daemon-89m59" (UID: "3eb950f1-309c-40b0-a9b2-13d46c7e2d60") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:45.355375 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.353479 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/209bb68d-9579-4062-8913-98bd2b4c61a1-sys-fs\") pod \"aws-ebs-csi-driver-node-dl6n7\" (UID: \"209bb68d-9579-4062-8913-98bd2b4c61a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7"
Apr 16 22:13:45.355375 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.353540 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/41a45a8e-91e8-4a52-90bd-9fddbbac91d0-system-cni-dir\") pod \"multus-additional-cni-plugins-jdp2m\" (UID: \"41a45a8e-91e8-4a52-90bd-9fddbbac91d0\") " pod="openshift-multus/multus-additional-cni-plugins-jdp2m"
Apr 16 22:13:45.355661 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.352551 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-var-lib-kubelet\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj"
Apr 16 22:13:45.355661 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.355475 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-host\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj"
Apr 16 22:13:45.355661 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.355498 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/209bb68d-9579-4062-8913-98bd2b4c61a1-registration-dir\") pod \"aws-ebs-csi-driver-node-dl6n7\" (UID: \"209bb68d-9579-4062-8913-98bd2b4c61a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7"
Apr 16 22:13:45.355661 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.355528 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pl85v\" (UniqueName: \"kubernetes.io/projected/43ef0782-3c86-49cc-a384-b383bf10c750-kube-api-access-pl85v\") pod \"node-resolver-nwdxm\" (UID: \"43ef0782-3c86-49cc-a384-b383bf10c750\") " pod="openshift-dns/node-resolver-nwdxm"
Apr 16 22:13:45.355661 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.355585 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/209bb68d-9579-4062-8913-98bd2b4c61a1-registration-dir\") pod \"aws-ebs-csi-driver-node-dl6n7\" (UID: \"209bb68d-9579-4062-8913-98bd2b4c61a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7"
Apr 16 22:13:45.355661 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.355602 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-lib-modules\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj"
Apr 16 22:13:45.355661 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.355631 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-host\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj"
Apr 16 22:13:45.355661 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.355647 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7a91f2b-d56f-4cd1-86e6-533c19173cab-host\") pod \"node-ca-kjs6v\" (UID: \"b7a91f2b-d56f-4cd1-86e6-533c19173cab\") " pod="openshift-image-registry/node-ca-kjs6v"
Apr 16 22:13:45.356006 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.355606 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a0f30e25-01b8-4134-8184-2a06ef526c62-env-overrides\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:13:45.356006 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.355682 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74sn8\" (UniqueName: \"kubernetes.io/projected/f4c633be-7a8f-41f5-ba11-b35591028a41-kube-api-access-74sn8\") pod \"iptables-alerter-4vrs4\" (UID: \"f4c633be-7a8f-41f5-ba11-b35591028a41\") " pod="openshift-network-operator/iptables-alerter-4vrs4"
Apr 16 22:13:45.356006 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.355719 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-host-run-ovn-kubernetes\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:13:45.356006 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.355743 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-lib-modules\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj"
Apr 16 22:13:45.356006 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.355803 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7a91f2b-d56f-4cd1-86e6-533c19173cab-host\") pod \"node-ca-kjs6v\" (UID: \"b7a91f2b-d56f-4cd1-86e6-533c19173cab\") " pod="openshift-image-registry/node-ca-kjs6v"
Apr 16 22:13:45.356006 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.355821 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/47517e74-f8d0-404e-8053-08ab938c2d36-etc-tuned\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj"
Apr 16 22:13:45.356006 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.355821 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a0f30e25-01b8-4134-8184-2a06ef526c62-ovnkube-script-lib\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:13:45.356006 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.355855 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-host-run-ovn-kubernetes\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:13:45.356006 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.355933 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a0f30e25-01b8-4134-8184-2a06ef526c62-ovn-node-metrics-cert\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:13:45.356325 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.355939 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/41a45a8e-91e8-4a52-90bd-9fddbbac91d0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jdp2m\" (UID: \"41a45a8e-91e8-4a52-90bd-9fddbbac91d0\") " pod="openshift-multus/multus-additional-cni-plugins-jdp2m"
Apr 16 22:13:45.356368 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356352 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhj7v\" (UniqueName: \"kubernetes.io/projected/41a45a8e-91e8-4a52-90bd-9fddbbac91d0-kube-api-access-fhj7v\") pod \"multus-additional-cni-plugins-jdp2m\" (UID: \"41a45a8e-91e8-4a52-90bd-9fddbbac91d0\") " pod="openshift-multus/multus-additional-cni-plugins-jdp2m"
Apr 16 22:13:45.356401 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356384 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-host-var-lib-cni-multus\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp"
Apr 16 22:13:45.356438 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356413 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-host-var-lib-kubelet\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp"
Apr 16 22:13:45.356483 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356438 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b7a91f2b-d56f-4cd1-86e6-533c19173cab-serviceca\") pod \"node-ca-kjs6v\" (UID: \"b7a91f2b-d56f-4cd1-86e6-533c19173cab\") " pod="openshift-image-registry/node-ca-kjs6v"
Apr 16 22:13:45.356536 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356480 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-host-var-lib-cni-multus\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp"
Apr 16 22:13:45.356536 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356495 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-host-var-lib-kubelet\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp"
Apr 16 22:13:45.356536 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356520 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-host-kubelet\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:13:45.356676 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356556 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-run-openvswitch\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:13:45.356676 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356561 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-host-kubelet\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:13:45.356676 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356607 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-log-socket\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:13:45.356676 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356634 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-host-cni-netd\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:13:45.356676 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356664 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-log-socket\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:13:45.356676 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356674 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-run\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj"
Apr 16 22:13:45.356984 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356663 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-run-openvswitch\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:13:45.356984 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356702 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mh9f\" (UniqueName: \"kubernetes.io/projected/209bb68d-9579-4062-8913-98bd2b4c61a1-kube-api-access-9mh9f\") pod \"aws-ebs-csi-driver-node-dl6n7\" (UID: \"209bb68d-9579-4062-8913-98bd2b4c61a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7"
Apr 16 22:13:45.356984 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356703 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-host-cni-netd\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:13:45.356984 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356731 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/41a45a8e-91e8-4a52-90bd-9fddbbac91d0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jdp2m\" (UID: \"41a45a8e-91e8-4a52-90bd-9fddbbac91d0\") " 
pod="openshift-multus/multus-additional-cni-plugins-jdp2m" Apr 16 22:13:45.356984 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356786 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-run\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" Apr 16 22:13:45.356984 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356796 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-cni-binary-copy\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.356984 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356818 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/41a45a8e-91e8-4a52-90bd-9fddbbac91d0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jdp2m\" (UID: \"41a45a8e-91e8-4a52-90bd-9fddbbac91d0\") " pod="openshift-multus/multus-additional-cni-plugins-jdp2m" Apr 16 22:13:45.356984 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356824 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-multus-socket-dir-parent\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.356984 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356882 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-host-run-netns\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.356984 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356886 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b7a91f2b-d56f-4cd1-86e6-533c19173cab-serviceca\") pod \"node-ca-kjs6v\" (UID: \"b7a91f2b-d56f-4cd1-86e6-533c19173cab\") " pod="openshift-image-registry/node-ca-kjs6v" Apr 16 22:13:45.356984 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356902 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-multus-socket-dir-parent\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.356984 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356885 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a0f30e25-01b8-4134-8184-2a06ef526c62-ovnkube-script-lib\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.356984 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356909 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/43ef0782-3c86-49cc-a384-b383bf10c750-hosts-file\") pod \"node-resolver-nwdxm\" (UID: \"43ef0782-3c86-49cc-a384-b383bf10c750\") " pod="openshift-dns/node-resolver-nwdxm" Apr 16 22:13:45.356984 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356934 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-host-run-netns\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.356984 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356938 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d31d911a-3d25-4b6f-ae92-8ec66384bbbf-konnectivity-ca\") pod \"konnectivity-agent-r6qrd\" (UID: \"d31d911a-3d25-4b6f-ae92-8ec66384bbbf\") " pod="kube-system/konnectivity-agent-r6qrd" Apr 16 22:13:45.356984 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356962 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/41a45a8e-91e8-4a52-90bd-9fddbbac91d0-os-release\") pod \"multus-additional-cni-plugins-jdp2m\" (UID: \"41a45a8e-91e8-4a52-90bd-9fddbbac91d0\") " pod="openshift-multus/multus-additional-cni-plugins-jdp2m" Apr 16 22:13:45.356984 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356985 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/43ef0782-3c86-49cc-a384-b383bf10c750-hosts-file\") pod \"node-resolver-nwdxm\" (UID: \"43ef0782-3c86-49cc-a384-b383bf10c750\") " pod="openshift-dns/node-resolver-nwdxm" Apr 16 22:13:45.357728 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.356988 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a0f30e25-01b8-4134-8184-2a06ef526c62-ovnkube-config\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.357728 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357013 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-etc-sysconfig\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " 
pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" Apr 16 22:13:45.357728 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357036 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-etc-kubernetes\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" Apr 16 22:13:45.357728 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357061 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-etc-kubernetes\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.357728 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357075 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-etc-sysconfig\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" Apr 16 22:13:45.357728 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357087 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-systemd-units\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.357728 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357112 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-run-systemd\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.357728 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357115 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-etc-kubernetes\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" Apr 16 22:13:45.357728 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357117 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-etc-kubernetes\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.357728 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357136 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-host-cni-bin\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.357728 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357145 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-systemd-units\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.357728 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357160 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-etc-sysctl-conf\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " 
pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" Apr 16 22:13:45.357728 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357167 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-run-systemd\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.357728 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357036 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/41a45a8e-91e8-4a52-90bd-9fddbbac91d0-os-release\") pod \"multus-additional-cni-plugins-jdp2m\" (UID: \"41a45a8e-91e8-4a52-90bd-9fddbbac91d0\") " pod="openshift-multus/multus-additional-cni-plugins-jdp2m" Apr 16 22:13:45.357728 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357187 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/209bb68d-9579-4062-8913-98bd2b4c61a1-etc-selinux\") pod \"aws-ebs-csi-driver-node-dl6n7\" (UID: \"209bb68d-9579-4062-8913-98bd2b4c61a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7" Apr 16 22:13:45.357728 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357194 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-host-cni-bin\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.357728 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357225 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/41a45a8e-91e8-4a52-90bd-9fddbbac91d0-cni-binary-copy\") pod \"multus-additional-cni-plugins-jdp2m\" (UID: 
\"41a45a8e-91e8-4a52-90bd-9fddbbac91d0\") " pod="openshift-multus/multus-additional-cni-plugins-jdp2m" Apr 16 22:13:45.358402 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357247 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/209bb68d-9579-4062-8913-98bd2b4c61a1-etc-selinux\") pod \"aws-ebs-csi-driver-node-dl6n7\" (UID: \"209bb68d-9579-4062-8913-98bd2b4c61a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7" Apr 16 22:13:45.358402 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357252 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-run-ovn\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.358402 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357285 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-run-ovn\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.358402 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357283 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-etc-modprobe-d\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" Apr 16 22:13:45.358402 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357303 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-etc-sysctl-conf\") pod \"tuned-6h8wj\" (UID: 
\"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" Apr 16 22:13:45.358402 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357316 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/41a45a8e-91e8-4a52-90bd-9fddbbac91d0-cnibin\") pod \"multus-additional-cni-plugins-jdp2m\" (UID: \"41a45a8e-91e8-4a52-90bd-9fddbbac91d0\") " pod="openshift-multus/multus-additional-cni-plugins-jdp2m" Apr 16 22:13:45.358402 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357323 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/41a45a8e-91e8-4a52-90bd-9fddbbac91d0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jdp2m\" (UID: \"41a45a8e-91e8-4a52-90bd-9fddbbac91d0\") " pod="openshift-multus/multus-additional-cni-plugins-jdp2m" Apr 16 22:13:45.358402 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357343 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-multus-cni-dir\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.358402 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357359 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-etc-modprobe-d\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" Apr 16 22:13:45.358402 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357380 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-host-run-netns\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.358402 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357397 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/41a45a8e-91e8-4a52-90bd-9fddbbac91d0-cnibin\") pod \"multus-additional-cni-plugins-jdp2m\" (UID: \"41a45a8e-91e8-4a52-90bd-9fddbbac91d0\") " pod="openshift-multus/multus-additional-cni-plugins-jdp2m" Apr 16 22:13:45.358402 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357407 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-cni-binary-copy\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.358402 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357411 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-multus-cni-dir\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.358402 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357431 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f4c633be-7a8f-41f5-ba11-b35591028a41-host-slash\") pod \"iptables-alerter-4vrs4\" (UID: \"f4c633be-7a8f-41f5-ba11-b35591028a41\") " pod="openshift-network-operator/iptables-alerter-4vrs4" Apr 16 22:13:45.358402 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357457 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-node-log\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.358402 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357471 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-host-run-netns\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.358402 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357435 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d31d911a-3d25-4b6f-ae92-8ec66384bbbf-konnectivity-ca\") pod \"konnectivity-agent-r6qrd\" (UID: \"d31d911a-3d25-4b6f-ae92-8ec66384bbbf\") " pod="kube-system/konnectivity-agent-r6qrd" Apr 16 22:13:45.358402 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357491 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.358958 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357506 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a0f30e25-01b8-4134-8184-2a06ef526c62-ovnkube-config\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.358958 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357488 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/f4c633be-7a8f-41f5-ba11-b35591028a41-host-slash\") pod \"iptables-alerter-4vrs4\" (UID: \"f4c633be-7a8f-41f5-ba11-b35591028a41\") " pod="openshift-network-operator/iptables-alerter-4vrs4" Apr 16 22:13:45.358958 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357510 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-node-log\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.358958 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357543 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.358958 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357521 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-etc-systemd\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" Apr 16 22:13:45.358958 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357583 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/47517e74-f8d0-404e-8053-08ab938c2d36-tmp\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" Apr 16 22:13:45.358958 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357586 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-etc-systemd\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" Apr 16 22:13:45.358958 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357607 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-hostroot\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.358958 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357632 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-multus-daemon-config\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.358958 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357655 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-host-run-multus-certs\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.358958 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357675 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-hostroot\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.358958 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357679 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ckdpn\" (UniqueName: 
\"kubernetes.io/projected/8e076f04-d3ed-41ce-ade7-ecc7c991025a-kube-api-access-ckdpn\") pod \"network-check-target-gzcjz\" (UID: \"8e076f04-d3ed-41ce-ade7-ecc7c991025a\") " pod="openshift-network-diagnostics/network-check-target-gzcjz" Apr 16 22:13:45.358958 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357722 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-etc-openvswitch\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.358958 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357748 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d31d911a-3d25-4b6f-ae92-8ec66384bbbf-agent-certs\") pod \"konnectivity-agent-r6qrd\" (UID: \"d31d911a-3d25-4b6f-ae92-8ec66384bbbf\") " pod="kube-system/konnectivity-agent-r6qrd" Apr 16 22:13:45.358958 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357792 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/209bb68d-9579-4062-8913-98bd2b4c61a1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dl6n7\" (UID: \"209bb68d-9579-4062-8913-98bd2b4c61a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7" Apr 16 22:13:45.358958 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357816 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-host-var-lib-cni-bin\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.358958 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357841 2571 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-pmq2s\" (UniqueName: \"kubernetes.io/projected/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-kube-api-access-pmq2s\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.358958 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357790 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/41a45a8e-91e8-4a52-90bd-9fddbbac91d0-cni-binary-copy\") pod \"multus-additional-cni-plugins-jdp2m\" (UID: \"41a45a8e-91e8-4a52-90bd-9fddbbac91d0\") " pod="openshift-multus/multus-additional-cni-plugins-jdp2m" Apr 16 22:13:45.359488 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357867 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0f30e25-01b8-4134-8184-2a06ef526c62-etc-openvswitch\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.359488 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357901 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-etc-sysctl-d\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" Apr 16 22:13:45.359488 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357920 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-sys\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" Apr 16 22:13:45.359488 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357921 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-host-var-lib-cni-bin\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.359488 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357936 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/41a45a8e-91e8-4a52-90bd-9fddbbac91d0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jdp2m\" (UID: \"41a45a8e-91e8-4a52-90bd-9fddbbac91d0\") " pod="openshift-multus/multus-additional-cni-plugins-jdp2m" Apr 16 22:13:45.359488 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357963 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f4c633be-7a8f-41f5-ba11-b35591028a41-iptables-alerter-script\") pod \"iptables-alerter-4vrs4\" (UID: \"f4c633be-7a8f-41f5-ba11-b35591028a41\") " pod="openshift-network-operator/iptables-alerter-4vrs4" Apr 16 22:13:45.359488 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.357966 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-host-run-multus-certs\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.359488 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.358048 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-etc-sysctl-d\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" Apr 16 22:13:45.359488 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.358098 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/47517e74-f8d0-404e-8053-08ab938c2d36-sys\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" Apr 16 22:13:45.359488 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.358130 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/41a45a8e-91e8-4a52-90bd-9fddbbac91d0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jdp2m\" (UID: \"41a45a8e-91e8-4a52-90bd-9fddbbac91d0\") " pod="openshift-multus/multus-additional-cni-plugins-jdp2m" Apr 16 22:13:45.359488 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.358159 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/209bb68d-9579-4062-8913-98bd2b4c61a1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dl6n7\" (UID: \"209bb68d-9579-4062-8913-98bd2b4c61a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7" Apr 16 22:13:45.359488 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.358447 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f4c633be-7a8f-41f5-ba11-b35591028a41-iptables-alerter-script\") pod \"iptables-alerter-4vrs4\" (UID: \"f4c633be-7a8f-41f5-ba11-b35591028a41\") " pod="openshift-network-operator/iptables-alerter-4vrs4" Apr 16 22:13:45.359488 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.358949 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-multus-daemon-config\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.360025 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.359748 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/47517e74-f8d0-404e-8053-08ab938c2d36-tmp\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" Apr 16 22:13:45.360114 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.360035 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d31d911a-3d25-4b6f-ae92-8ec66384bbbf-agent-certs\") pod \"konnectivity-agent-r6qrd\" (UID: \"d31d911a-3d25-4b6f-ae92-8ec66384bbbf\") " pod="kube-system/konnectivity-agent-r6qrd" Apr 16 22:13:45.362641 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.362599 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjrkh\" (UniqueName: \"kubernetes.io/projected/b7a91f2b-d56f-4cd1-86e6-533c19173cab-kube-api-access-sjrkh\") pod \"node-ca-kjs6v\" (UID: \"b7a91f2b-d56f-4cd1-86e6-533c19173cab\") " pod="openshift-image-registry/node-ca-kjs6v" Apr 16 22:13:45.363592 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.363559 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cfp8\" (UniqueName: \"kubernetes.io/projected/47517e74-f8d0-404e-8053-08ab938c2d36-kube-api-access-6cfp8\") pod \"tuned-6h8wj\" (UID: \"47517e74-f8d0-404e-8053-08ab938c2d36\") " pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" Apr 16 22:13:45.363741 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.363726 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r2mt\" (UniqueName: \"kubernetes.io/projected/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-kube-api-access-5r2mt\") pod \"network-metrics-daemon-89m59\" (UID: \"3eb950f1-309c-40b0-a9b2-13d46c7e2d60\") " pod="openshift-multus/network-metrics-daemon-89m59" Apr 16 22:13:45.363741 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.363734 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl85v\" (UniqueName: \"kubernetes.io/projected/43ef0782-3c86-49cc-a384-b383bf10c750-kube-api-access-pl85v\") pod \"node-resolver-nwdxm\" (UID: \"43ef0782-3c86-49cc-a384-b383bf10c750\") " pod="openshift-dns/node-resolver-nwdxm" Apr 16 22:13:45.364055 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.364041 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74sn8\" (UniqueName: \"kubernetes.io/projected/f4c633be-7a8f-41f5-ba11-b35591028a41-kube-api-access-74sn8\") pod \"iptables-alerter-4vrs4\" (UID: \"f4c633be-7a8f-41f5-ba11-b35591028a41\") " pod="openshift-network-operator/iptables-alerter-4vrs4" Apr 16 22:13:45.364295 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.364279 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lctw2\" (UniqueName: \"kubernetes.io/projected/a0f30e25-01b8-4134-8184-2a06ef526c62-kube-api-access-lctw2\") pod \"ovnkube-node-88hfq\" (UID: \"a0f30e25-01b8-4134-8184-2a06ef526c62\") " pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.364407 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:45.364390 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:45.364471 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:45.364414 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:45.364471 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:45.364429 2571 projected.go:194] Error preparing data for projected volume kube-api-access-ckdpn for pod openshift-network-diagnostics/network-check-target-gzcjz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:45.364579 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:45.364567 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e076f04-d3ed-41ce-ade7-ecc7c991025a-kube-api-access-ckdpn podName:8e076f04-d3ed-41ce-ade7-ecc7c991025a nodeName:}" failed. No retries permitted until 2026-04-16 22:13:45.864547096 +0000 UTC m=+2.161255536 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ckdpn" (UniqueName: "kubernetes.io/projected/8e076f04-d3ed-41ce-ade7-ecc7c991025a-kube-api-access-ckdpn") pod "network-check-target-gzcjz" (UID: "8e076f04-d3ed-41ce-ade7-ecc7c991025a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:45.364951 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.364934 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhj7v\" (UniqueName: \"kubernetes.io/projected/41a45a8e-91e8-4a52-90bd-9fddbbac91d0-kube-api-access-fhj7v\") pod \"multus-additional-cni-plugins-jdp2m\" (UID: \"41a45a8e-91e8-4a52-90bd-9fddbbac91d0\") " pod="openshift-multus/multus-additional-cni-plugins-jdp2m" Apr 16 22:13:45.366338 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.366322 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmq2s\" (UniqueName: \"kubernetes.io/projected/efe2768a-1ae2-4e05-9621-e7a2cb669a2f-kube-api-access-pmq2s\") pod \"multus-q92dp\" (UID: \"efe2768a-1ae2-4e05-9621-e7a2cb669a2f\") " pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.366389 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.366374 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mh9f\" (UniqueName: \"kubernetes.io/projected/209bb68d-9579-4062-8913-98bd2b4c61a1-kube-api-access-9mh9f\") pod 
\"aws-ebs-csi-driver-node-dl6n7\" (UID: \"209bb68d-9579-4062-8913-98bd2b4c61a1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7" Apr 16 22:13:45.370081 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.370037 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-16.ec2.internal" event={"ID":"9690e09cc5a9f42f04eae10c4110dd59","Type":"ContainerStarted","Data":"84393c6ff3f7bf09bcd448703152ce18eef55b4a66dc9b97ef6bfd4276deae24"} Apr 16 22:13:45.370958 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.370937 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-16.ec2.internal" event={"ID":"7d17133b5a635b239940c82faeb49a51","Type":"ContainerStarted","Data":"4960691fad943365617a2105d27b21a51e5fb82511e8be992b6018f705285746"} Apr 16 22:13:45.553929 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.553841 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7" Apr 16 22:13:45.559877 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:45.559852 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod209bb68d_9579_4062_8913_98bd2b4c61a1.slice/crio-39c5bc069ab97963b7559b7bfb13c97502b12526f557cbc5cded484025c44652 WatchSource:0}: Error finding container 39c5bc069ab97963b7559b7bfb13c97502b12526f557cbc5cded484025c44652: Status 404 returned error can't find the container with id 39c5bc069ab97963b7559b7bfb13c97502b12526f557cbc5cded484025c44652 Apr 16 22:13:45.574831 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.574805 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-nwdxm" Apr 16 22:13:45.580748 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:45.580715 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43ef0782_3c86_49cc_a384_b383bf10c750.slice/crio-c12b43b0bd57d793563b87d055865ba98119c415fdc7ed69d26c2464ef530282 WatchSource:0}: Error finding container c12b43b0bd57d793563b87d055865ba98119c415fdc7ed69d26c2464ef530282: Status 404 returned error can't find the container with id c12b43b0bd57d793563b87d055865ba98119c415fdc7ed69d26c2464ef530282 Apr 16 22:13:45.592455 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.592430 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kjs6v" Apr 16 22:13:45.598111 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.598092 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jdp2m" Apr 16 22:13:45.598391 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:45.598366 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7a91f2b_d56f_4cd1_86e6_533c19173cab.slice/crio-c1c1de58108a7b6af82c7e95d76ba64c3ed4af3420d858125803b76711cd3dc6 WatchSource:0}: Error finding container c1c1de58108a7b6af82c7e95d76ba64c3ed4af3420d858125803b76711cd3dc6: Status 404 returned error can't find the container with id c1c1de58108a7b6af82c7e95d76ba64c3ed4af3420d858125803b76711cd3dc6 Apr 16 22:13:45.604516 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.604498 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4vrs4" Apr 16 22:13:45.605982 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:45.605958 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41a45a8e_91e8_4a52_90bd_9fddbbac91d0.slice/crio-098b4188185c2d1e2a80cf19b2cd30cb2971ffc24cc1cfc2182e93505a535761 WatchSource:0}: Error finding container 098b4188185c2d1e2a80cf19b2cd30cb2971ffc24cc1cfc2182e93505a535761: Status 404 returned error can't find the container with id 098b4188185c2d1e2a80cf19b2cd30cb2971ffc24cc1cfc2182e93505a535761 Apr 16 22:13:45.610476 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.610457 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-q92dp" Apr 16 22:13:45.610700 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:45.610677 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4c633be_7a8f_41f5_ba11_b35591028a41.slice/crio-8ff518371785d892910f32e270de98dc02f28863b8efc9f66a4515c0e5f97e58 WatchSource:0}: Error finding container 8ff518371785d892910f32e270de98dc02f28863b8efc9f66a4515c0e5f97e58: Status 404 returned error can't find the container with id 8ff518371785d892910f32e270de98dc02f28863b8efc9f66a4515c0e5f97e58 Apr 16 22:13:45.616285 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.616266 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-r6qrd" Apr 16 22:13:45.616758 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:45.616685 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefe2768a_1ae2_4e05_9621_e7a2cb669a2f.slice/crio-fccb880f2adfa3920193586405336a1a2e1d1493fdee6cd3719b747ff2f3dcf4 WatchSource:0}: Error finding container fccb880f2adfa3920193586405336a1a2e1d1493fdee6cd3719b747ff2f3dcf4: Status 404 returned error can't find the container with id fccb880f2adfa3920193586405336a1a2e1d1493fdee6cd3719b747ff2f3dcf4 Apr 16 22:13:45.622431 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.622412 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" Apr 16 22:13:45.622657 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:45.622639 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd31d911a_3d25_4b6f_ae92_8ec66384bbbf.slice/crio-a9001369c698acc0a8a46cb0d15386cde24f6f6199964c6a2bbf74f9cf330704 WatchSource:0}: Error finding container a9001369c698acc0a8a46cb0d15386cde24f6f6199964c6a2bbf74f9cf330704: Status 404 returned error can't find the container with id a9001369c698acc0a8a46cb0d15386cde24f6f6199964c6a2bbf74f9cf330704 Apr 16 22:13:45.627094 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.627026 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:13:45.628731 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:45.628709 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47517e74_f8d0_404e_8053_08ab938c2d36.slice/crio-1ebb612d52b1797f1ead8516e5490a17f7ae4a68a556d96bd44c912a933b7708 WatchSource:0}: Error finding container 1ebb612d52b1797f1ead8516e5490a17f7ae4a68a556d96bd44c912a933b7708: Status 404 returned error can't find the container with id 1ebb612d52b1797f1ead8516e5490a17f7ae4a68a556d96bd44c912a933b7708 Apr 16 22:13:45.633824 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:13:45.633801 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0f30e25_01b8_4134_8184_2a06ef526c62.slice/crio-b3d97f517d68cea7f1429d65f72c889030a4938cd1da9dbf628ba7453b2fc216 WatchSource:0}: Error finding container b3d97f517d68cea7f1429d65f72c889030a4938cd1da9dbf628ba7453b2fc216: Status 404 returned error can't find the container with id b3d97f517d68cea7f1429d65f72c889030a4938cd1da9dbf628ba7453b2fc216 Apr 16 22:13:45.862205 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.862120 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs\") pod \"network-metrics-daemon-89m59\" (UID: \"3eb950f1-309c-40b0-a9b2-13d46c7e2d60\") " pod="openshift-multus/network-metrics-daemon-89m59" Apr 16 22:13:45.862358 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:45.862280 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:45.862358 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:45.862349 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs podName:3eb950f1-309c-40b0-a9b2-13d46c7e2d60 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:46.862325352 +0000 UTC m=+3.159033794 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs") pod "network-metrics-daemon-89m59" (UID: "3eb950f1-309c-40b0-a9b2-13d46c7e2d60") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:45.964130 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:45.963464 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ckdpn\" (UniqueName: \"kubernetes.io/projected/8e076f04-d3ed-41ce-ade7-ecc7c991025a-kube-api-access-ckdpn\") pod \"network-check-target-gzcjz\" (UID: \"8e076f04-d3ed-41ce-ade7-ecc7c991025a\") " pod="openshift-network-diagnostics/network-check-target-gzcjz" Apr 16 22:13:45.964130 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:45.963672 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:45.964130 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:45.963693 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:45.964130 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:45.963709 2571 projected.go:194] Error preparing data for projected volume kube-api-access-ckdpn for pod openshift-network-diagnostics/network-check-target-gzcjz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:45.964130 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:45.963787 2571 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e076f04-d3ed-41ce-ade7-ecc7c991025a-kube-api-access-ckdpn podName:8e076f04-d3ed-41ce-ade7-ecc7c991025a nodeName:}" failed. No retries permitted until 2026-04-16 22:13:46.963749838 +0000 UTC m=+3.260458280 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-ckdpn" (UniqueName: "kubernetes.io/projected/8e076f04-d3ed-41ce-ade7-ecc7c991025a-kube-api-access-ckdpn") pod "network-check-target-gzcjz" (UID: "8e076f04-d3ed-41ce-ade7-ecc7c991025a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:46.289739 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:46.289655 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 22:08:45 +0000 UTC" deadline="2028-01-25 10:55:54.868680348 +0000 UTC" Apr 16 22:13:46.289739 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:46.289688 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15564h42m8.578996738s" Apr 16 22:13:46.315851 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:46.315817 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:46.378936 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:46.373057 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gzcjz" Apr 16 22:13:46.378936 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:46.373214 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gzcjz" podUID="8e076f04-d3ed-41ce-ade7-ecc7c991025a" Apr 16 22:13:46.398232 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:46.398107 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" event={"ID":"47517e74-f8d0-404e-8053-08ab938c2d36","Type":"ContainerStarted","Data":"1ebb612d52b1797f1ead8516e5490a17f7ae4a68a556d96bd44c912a933b7708"} Apr 16 22:13:46.412293 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:46.412251 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q92dp" event={"ID":"efe2768a-1ae2-4e05-9621-e7a2cb669a2f","Type":"ContainerStarted","Data":"fccb880f2adfa3920193586405336a1a2e1d1493fdee6cd3719b747ff2f3dcf4"} Apr 16 22:13:46.420652 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:46.420573 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdp2m" event={"ID":"41a45a8e-91e8-4a52-90bd-9fddbbac91d0","Type":"ContainerStarted","Data":"098b4188185c2d1e2a80cf19b2cd30cb2971ffc24cc1cfc2182e93505a535761"} Apr 16 22:13:46.429816 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:46.429777 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kjs6v" event={"ID":"b7a91f2b-d56f-4cd1-86e6-533c19173cab","Type":"ContainerStarted","Data":"c1c1de58108a7b6af82c7e95d76ba64c3ed4af3420d858125803b76711cd3dc6"} Apr 16 22:13:46.448341 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:46.448296 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7" event={"ID":"209bb68d-9579-4062-8913-98bd2b4c61a1","Type":"ContainerStarted","Data":"39c5bc069ab97963b7559b7bfb13c97502b12526f557cbc5cded484025c44652"} Apr 16 22:13:46.456985 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:46.456886 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" event={"ID":"a0f30e25-01b8-4134-8184-2a06ef526c62","Type":"ContainerStarted","Data":"b3d97f517d68cea7f1429d65f72c889030a4938cd1da9dbf628ba7453b2fc216"} Apr 16 22:13:46.461266 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:46.461063 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-r6qrd" event={"ID":"d31d911a-3d25-4b6f-ae92-8ec66384bbbf","Type":"ContainerStarted","Data":"a9001369c698acc0a8a46cb0d15386cde24f6f6199964c6a2bbf74f9cf330704"} Apr 16 22:13:46.465422 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:46.465388 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4vrs4" event={"ID":"f4c633be-7a8f-41f5-ba11-b35591028a41","Type":"ContainerStarted","Data":"8ff518371785d892910f32e270de98dc02f28863b8efc9f66a4515c0e5f97e58"} Apr 16 22:13:46.482078 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:46.481480 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nwdxm" event={"ID":"43ef0782-3c86-49cc-a384-b383bf10c750","Type":"ContainerStarted","Data":"c12b43b0bd57d793563b87d055865ba98119c415fdc7ed69d26c2464ef530282"} Apr 16 22:13:46.498422 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:46.498376 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:46.614918 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:46.614830 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:46.872106 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:46.872022 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs\") pod \"network-metrics-daemon-89m59\" (UID: \"3eb950f1-309c-40b0-a9b2-13d46c7e2d60\") " 
pod="openshift-multus/network-metrics-daemon-89m59" Apr 16 22:13:46.872249 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:46.872219 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:46.872288 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:46.872284 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs podName:3eb950f1-309c-40b0-a9b2-13d46c7e2d60 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:48.872264264 +0000 UTC m=+5.168972706 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs") pod "network-metrics-daemon-89m59" (UID: "3eb950f1-309c-40b0-a9b2-13d46c7e2d60") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:46.972478 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:46.972440 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ckdpn\" (UniqueName: \"kubernetes.io/projected/8e076f04-d3ed-41ce-ade7-ecc7c991025a-kube-api-access-ckdpn\") pod \"network-check-target-gzcjz\" (UID: \"8e076f04-d3ed-41ce-ade7-ecc7c991025a\") " pod="openshift-network-diagnostics/network-check-target-gzcjz" Apr 16 22:13:46.972680 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:46.972626 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:46.972680 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:46.972650 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:46.972680 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:46.972663 2571 
projected.go:194] Error preparing data for projected volume kube-api-access-ckdpn for pod openshift-network-diagnostics/network-check-target-gzcjz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:46.972871 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:46.972722 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e076f04-d3ed-41ce-ade7-ecc7c991025a-kube-api-access-ckdpn podName:8e076f04-d3ed-41ce-ade7-ecc7c991025a nodeName:}" failed. No retries permitted until 2026-04-16 22:13:48.972704062 +0000 UTC m=+5.269412504 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ckdpn" (UniqueName: "kubernetes.io/projected/8e076f04-d3ed-41ce-ade7-ecc7c991025a-kube-api-access-ckdpn") pod "network-check-target-gzcjz" (UID: "8e076f04-d3ed-41ce-ade7-ecc7c991025a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:47.289991 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:47.289864 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 22:08:45 +0000 UTC" deadline="2028-01-22 13:06:22.84456705 +0000 UTC" Apr 16 22:13:47.289991 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:47.289903 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15494h52m35.554667161s" Apr 16 22:13:47.367103 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:47.367071 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-89m59" Apr 16 22:13:47.367349 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:47.367235 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-89m59" podUID="3eb950f1-309c-40b0-a9b2-13d46c7e2d60" Apr 16 22:13:48.369983 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:48.369948 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gzcjz" Apr 16 22:13:48.370433 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:48.370081 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gzcjz" podUID="8e076f04-d3ed-41ce-ade7-ecc7c991025a" Apr 16 22:13:48.888570 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:48.887981 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs\") pod \"network-metrics-daemon-89m59\" (UID: \"3eb950f1-309c-40b0-a9b2-13d46c7e2d60\") " pod="openshift-multus/network-metrics-daemon-89m59" Apr 16 22:13:48.888570 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:48.888135 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:48.888570 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:48.888217 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs podName:3eb950f1-309c-40b0-a9b2-13d46c7e2d60 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:52.888196308 +0000 UTC m=+9.184904767 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs") pod "network-metrics-daemon-89m59" (UID: "3eb950f1-309c-40b0-a9b2-13d46c7e2d60") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:48.988638 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:48.988572 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ckdpn\" (UniqueName: \"kubernetes.io/projected/8e076f04-d3ed-41ce-ade7-ecc7c991025a-kube-api-access-ckdpn\") pod \"network-check-target-gzcjz\" (UID: \"8e076f04-d3ed-41ce-ade7-ecc7c991025a\") " pod="openshift-network-diagnostics/network-check-target-gzcjz" Apr 16 22:13:48.988897 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:48.988878 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:48.988998 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:48.988903 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:48.988998 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:48.988917 2571 projected.go:194] Error preparing data for projected volume kube-api-access-ckdpn for pod openshift-network-diagnostics/network-check-target-gzcjz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:48.988998 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:48.988991 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e076f04-d3ed-41ce-ade7-ecc7c991025a-kube-api-access-ckdpn podName:8e076f04-d3ed-41ce-ade7-ecc7c991025a nodeName:}" failed. 
No retries permitted until 2026-04-16 22:13:52.988975667 +0000 UTC m=+9.285684115 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ckdpn" (UniqueName: "kubernetes.io/projected/8e076f04-d3ed-41ce-ade7-ecc7c991025a-kube-api-access-ckdpn") pod "network-check-target-gzcjz" (UID: "8e076f04-d3ed-41ce-ade7-ecc7c991025a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:49.367220 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:49.367137 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-89m59" Apr 16 22:13:49.367389 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:49.367300 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-89m59" podUID="3eb950f1-309c-40b0-a9b2-13d46c7e2d60" Apr 16 22:13:50.367219 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:50.367123 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gzcjz" Apr 16 22:13:50.367671 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:50.367270 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gzcjz" podUID="8e076f04-d3ed-41ce-ade7-ecc7c991025a" Apr 16 22:13:51.367169 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:51.367130 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-89m59" Apr 16 22:13:51.367349 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:51.367278 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-89m59" podUID="3eb950f1-309c-40b0-a9b2-13d46c7e2d60" Apr 16 22:13:52.367472 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:52.367430 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gzcjz" Apr 16 22:13:52.367983 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:52.367595 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gzcjz" podUID="8e076f04-d3ed-41ce-ade7-ecc7c991025a" Apr 16 22:13:52.924351 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:52.924278 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs\") pod \"network-metrics-daemon-89m59\" (UID: \"3eb950f1-309c-40b0-a9b2-13d46c7e2d60\") " pod="openshift-multus/network-metrics-daemon-89m59" Apr 16 22:13:52.924523 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:52.924450 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:52.924642 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:52.924529 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs podName:3eb950f1-309c-40b0-a9b2-13d46c7e2d60 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:00.924508445 +0000 UTC m=+17.221216884 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs") pod "network-metrics-daemon-89m59" (UID: "3eb950f1-309c-40b0-a9b2-13d46c7e2d60") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:53.025684 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:53.025646 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ckdpn\" (UniqueName: \"kubernetes.io/projected/8e076f04-d3ed-41ce-ade7-ecc7c991025a-kube-api-access-ckdpn\") pod \"network-check-target-gzcjz\" (UID: \"8e076f04-d3ed-41ce-ade7-ecc7c991025a\") " pod="openshift-network-diagnostics/network-check-target-gzcjz" Apr 16 22:13:53.025899 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:53.025850 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:53.025899 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:53.025870 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:53.025899 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:53.025883 2571 projected.go:194] Error preparing data for projected volume kube-api-access-ckdpn for pod openshift-network-diagnostics/network-check-target-gzcjz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:53.026058 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:53.025942 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e076f04-d3ed-41ce-ade7-ecc7c991025a-kube-api-access-ckdpn podName:8e076f04-d3ed-41ce-ade7-ecc7c991025a nodeName:}" failed. 
No retries permitted until 2026-04-16 22:14:01.025922694 +0000 UTC m=+17.322631131 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ckdpn" (UniqueName: "kubernetes.io/projected/8e076f04-d3ed-41ce-ade7-ecc7c991025a-kube-api-access-ckdpn") pod "network-check-target-gzcjz" (UID: "8e076f04-d3ed-41ce-ade7-ecc7c991025a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:53.367261 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:53.367227 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-89m59" Apr 16 22:13:53.367451 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:53.367384 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-89m59" podUID="3eb950f1-309c-40b0-a9b2-13d46c7e2d60" Apr 16 22:13:54.367544 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:54.367502 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gzcjz" Apr 16 22:13:54.368007 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:54.367631 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gzcjz" podUID="8e076f04-d3ed-41ce-ade7-ecc7c991025a" Apr 16 22:13:55.367460 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:55.367425 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-89m59" Apr 16 22:13:55.367650 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:55.367555 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-89m59" podUID="3eb950f1-309c-40b0-a9b2-13d46c7e2d60" Apr 16 22:13:56.367077 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:56.367040 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gzcjz" Apr 16 22:13:56.367255 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:56.367169 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gzcjz" podUID="8e076f04-d3ed-41ce-ade7-ecc7c991025a" Apr 16 22:13:57.366873 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:57.366836 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-89m59" Apr 16 22:13:57.367358 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:57.366978 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-89m59" podUID="3eb950f1-309c-40b0-a9b2-13d46c7e2d60" Apr 16 22:13:58.367011 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:58.366974 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gzcjz" Apr 16 22:13:58.367494 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:58.367110 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gzcjz" podUID="8e076f04-d3ed-41ce-ade7-ecc7c991025a" Apr 16 22:13:59.367667 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:13:59.367613 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-89m59" Apr 16 22:13:59.368149 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:13:59.367756 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-89m59" podUID="3eb950f1-309c-40b0-a9b2-13d46c7e2d60" Apr 16 22:14:00.367118 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:00.367077 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gzcjz" Apr 16 22:14:00.367293 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:00.367221 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gzcjz" podUID="8e076f04-d3ed-41ce-ade7-ecc7c991025a" Apr 16 22:14:00.979380 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:00.979330 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs\") pod \"network-metrics-daemon-89m59\" (UID: \"3eb950f1-309c-40b0-a9b2-13d46c7e2d60\") " pod="openshift-multus/network-metrics-daemon-89m59" Apr 16 22:14:00.979910 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:00.979489 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:14:00.979910 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:00.979565 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs podName:3eb950f1-309c-40b0-a9b2-13d46c7e2d60 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:16.979547661 +0000 UTC m=+33.276256096 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs") pod "network-metrics-daemon-89m59" (UID: "3eb950f1-309c-40b0-a9b2-13d46c7e2d60") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:14:01.080319 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:01.080271 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ckdpn\" (UniqueName: \"kubernetes.io/projected/8e076f04-d3ed-41ce-ade7-ecc7c991025a-kube-api-access-ckdpn\") pod \"network-check-target-gzcjz\" (UID: \"8e076f04-d3ed-41ce-ade7-ecc7c991025a\") " pod="openshift-network-diagnostics/network-check-target-gzcjz" Apr 16 22:14:01.080473 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:01.080458 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:14:01.080530 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:01.080484 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:14:01.080530 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:01.080497 2571 projected.go:194] Error preparing data for projected volume kube-api-access-ckdpn for pod openshift-network-diagnostics/network-check-target-gzcjz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:14:01.080594 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:01.080559 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e076f04-d3ed-41ce-ade7-ecc7c991025a-kube-api-access-ckdpn podName:8e076f04-d3ed-41ce-ade7-ecc7c991025a nodeName:}" failed. 
No retries permitted until 2026-04-16 22:14:17.080538731 +0000 UTC m=+33.377247169 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ckdpn" (UniqueName: "kubernetes.io/projected/8e076f04-d3ed-41ce-ade7-ecc7c991025a-kube-api-access-ckdpn") pod "network-check-target-gzcjz" (UID: "8e076f04-d3ed-41ce-ade7-ecc7c991025a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:14:01.367501 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:01.367469 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-89m59" Apr 16 22:14:01.367708 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:01.367615 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-89m59" podUID="3eb950f1-309c-40b0-a9b2-13d46c7e2d60" Apr 16 22:14:02.366937 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:02.366900 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gzcjz" Apr 16 22:14:02.367433 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:02.367035 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gzcjz" podUID="8e076f04-d3ed-41ce-ade7-ecc7c991025a" Apr 16 22:14:03.367212 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:03.367167 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-89m59" Apr 16 22:14:03.367556 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:03.367303 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-89m59" podUID="3eb950f1-309c-40b0-a9b2-13d46c7e2d60" Apr 16 22:14:04.368037 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:04.367544 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gzcjz" Apr 16 22:14:04.368811 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:04.368133 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gzcjz" podUID="8e076f04-d3ed-41ce-ade7-ecc7c991025a" Apr 16 22:14:04.536272 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:04.536237 2571 generic.go:358] "Generic (PLEG): container finished" podID="9690e09cc5a9f42f04eae10c4110dd59" containerID="70bb72a162e210de233080f5933846b314390b2a7797f40b6acde9173995ed1e" exitCode=0 Apr 16 22:14:04.536466 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:04.536305 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-16.ec2.internal" event={"ID":"9690e09cc5a9f42f04eae10c4110dd59","Type":"ContainerDied","Data":"70bb72a162e210de233080f5933846b314390b2a7797f40b6acde9173995ed1e"} Apr 16 22:14:04.537626 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:04.537592 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" event={"ID":"47517e74-f8d0-404e-8053-08ab938c2d36","Type":"ContainerStarted","Data":"3c920dc45d25eff9cc997cb3f3cea2f09c04216d56fb8aec3aa5ee69c583e254"} Apr 16 22:14:04.538936 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:04.538909 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q92dp" event={"ID":"efe2768a-1ae2-4e05-9621-e7a2cb669a2f","Type":"ContainerStarted","Data":"9108d8a2b3dfae982df2d6f7ca4af8337096316766a1a38549ac98f70186161f"} Apr 16 22:14:04.540621 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:04.540489 2571 generic.go:358] "Generic (PLEG): container finished" podID="41a45a8e-91e8-4a52-90bd-9fddbbac91d0" containerID="2cb574377be9733a3174807f6610436e770e3a44105f65967a5c6ef88d7a0a3b" exitCode=0 Apr 16 22:14:04.540621 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:04.540547 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdp2m" 
event={"ID":"41a45a8e-91e8-4a52-90bd-9fddbbac91d0","Type":"ContainerDied","Data":"2cb574377be9733a3174807f6610436e770e3a44105f65967a5c6ef88d7a0a3b"} Apr 16 22:14:04.542038 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:04.541998 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kjs6v" event={"ID":"b7a91f2b-d56f-4cd1-86e6-533c19173cab","Type":"ContainerStarted","Data":"7223f2c96aba5ea1f4f430f9aed9c0a0b4c4cb150e19154b801c8aed2aee02ae"} Apr 16 22:14:04.543760 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:04.543741 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7" event={"ID":"209bb68d-9579-4062-8913-98bd2b4c61a1","Type":"ContainerStarted","Data":"dcd8406053b57601f0818e1aab63f82ca0763744711c0ce9d83e5519bfac8835"} Apr 16 22:14:04.545053 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:04.545032 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-16.ec2.internal" event={"ID":"7d17133b5a635b239940c82faeb49a51","Type":"ContainerStarted","Data":"4b90c8a371c62ae6bfd4774d2c03d03a240d57a64578217045218f1f4c5f09bb"} Apr 16 22:14:04.547966 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:04.547943 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" event={"ID":"a0f30e25-01b8-4134-8184-2a06ef526c62","Type":"ContainerStarted","Data":"05e0e6eb1b398540905e49c48754bbac0435ad13658080e7ebcf034e6cb38b6c"} Apr 16 22:14:04.548042 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:04.547972 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" event={"ID":"a0f30e25-01b8-4134-8184-2a06ef526c62","Type":"ContainerStarted","Data":"6366dcad28f96fb428fd4a324c5efb6dd201ef6f7156091d8e469162d6fe1bc4"} Apr 16 22:14:04.548042 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:04.547984 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" event={"ID":"a0f30e25-01b8-4134-8184-2a06ef526c62","Type":"ContainerStarted","Data":"0beacc7e9beee41aeda4f604f6806022a1db7598d41df6a0aea4dd3139f82c9f"} Apr 16 22:14:04.548042 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:04.547993 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" event={"ID":"a0f30e25-01b8-4134-8184-2a06ef526c62","Type":"ContainerStarted","Data":"2ca667ce8bf9f103894a1921002a33cb668cd3158a358c8d2a3bdb8d6ff3c6f1"} Apr 16 22:14:04.548042 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:04.548001 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" event={"ID":"a0f30e25-01b8-4134-8184-2a06ef526c62","Type":"ContainerStarted","Data":"5156fa85ed55113aa204267ba23e0521adce6e02da993811e1416fe11bdb146f"} Apr 16 22:14:04.548042 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:04.548009 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" event={"ID":"a0f30e25-01b8-4134-8184-2a06ef526c62","Type":"ContainerStarted","Data":"c6ae137f17626368a09ab2a7c931610c2335faa207335afff6dada81f1f9e954"} Apr 16 22:14:04.549141 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:04.549119 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-r6qrd" event={"ID":"d31d911a-3d25-4b6f-ae92-8ec66384bbbf","Type":"ContainerStarted","Data":"2227f4df1e06ecc7a46437fe27c54e36963a06d38edde7c77e229c5eee56363a"} Apr 16 22:14:04.550178 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:04.550161 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nwdxm" event={"ID":"43ef0782-3c86-49cc-a384-b383bf10c750","Type":"ContainerStarted","Data":"a2226b45d637d109737ddaaf1d56810c9f3b65e2ae565d1041288bfb89266a0f"} Apr 16 22:14:04.568730 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:04.568674 2571 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-16.ec2.internal" podStartSLOduration=20.568656055 podStartE2EDuration="20.568656055s" podCreationTimestamp="2026-04-16 22:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:14:04.568241269 +0000 UTC m=+20.864949726" watchObservedRunningTime="2026-04-16 22:14:04.568656055 +0000 UTC m=+20.865364516" Apr 16 22:14:04.604169 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:04.604105 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kjs6v" podStartSLOduration=2.722030989 podStartE2EDuration="20.6040881s" podCreationTimestamp="2026-04-16 22:13:44 +0000 UTC" firstStartedPulling="2026-04-16 22:13:45.601071339 +0000 UTC m=+1.897779775" lastFinishedPulling="2026-04-16 22:14:03.48312845 +0000 UTC m=+19.779836886" observedRunningTime="2026-04-16 22:14:04.581878105 +0000 UTC m=+20.878586564" watchObservedRunningTime="2026-04-16 22:14:04.6040881 +0000 UTC m=+20.900796559" Apr 16 22:14:04.623596 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:04.623549 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-6h8wj" podStartSLOduration=2.768630187 podStartE2EDuration="20.623534734s" podCreationTimestamp="2026-04-16 22:13:44 +0000 UTC" firstStartedPulling="2026-04-16 22:13:45.630309471 +0000 UTC m=+1.927017906" lastFinishedPulling="2026-04-16 22:14:03.485214 +0000 UTC m=+19.781922453" observedRunningTime="2026-04-16 22:14:04.623213397 +0000 UTC m=+20.919921854" watchObservedRunningTime="2026-04-16 22:14:04.623534734 +0000 UTC m=+20.920243192" Apr 16 22:14:04.637780 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:04.637722 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-nwdxm" podStartSLOduration=2.736293858 
podStartE2EDuration="20.63770682s" podCreationTimestamp="2026-04-16 22:13:44 +0000 UTC" firstStartedPulling="2026-04-16 22:13:45.582151701 +0000 UTC m=+1.878860137" lastFinishedPulling="2026-04-16 22:14:03.483564659 +0000 UTC m=+19.780273099" observedRunningTime="2026-04-16 22:14:04.63693232 +0000 UTC m=+20.933640779" watchObservedRunningTime="2026-04-16 22:14:04.63770682 +0000 UTC m=+20.934415278" Apr 16 22:14:04.653667 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:04.653627 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-q92dp" podStartSLOduration=2.754891128 podStartE2EDuration="20.653613313s" podCreationTimestamp="2026-04-16 22:13:44 +0000 UTC" firstStartedPulling="2026-04-16 22:13:45.61909827 +0000 UTC m=+1.915806919" lastFinishedPulling="2026-04-16 22:14:03.517820666 +0000 UTC m=+19.814529104" observedRunningTime="2026-04-16 22:14:04.653196357 +0000 UTC m=+20.949904817" watchObservedRunningTime="2026-04-16 22:14:04.653613313 +0000 UTC m=+20.950321771" Apr 16 22:14:04.668267 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:04.668222 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-r6qrd" podStartSLOduration=2.809941787 podStartE2EDuration="20.66820919s" podCreationTimestamp="2026-04-16 22:13:44 +0000 UTC" firstStartedPulling="2026-04-16 22:13:45.625541533 +0000 UTC m=+1.922249969" lastFinishedPulling="2026-04-16 22:14:03.483808935 +0000 UTC m=+19.780517372" observedRunningTime="2026-04-16 22:14:04.667939021 +0000 UTC m=+20.964647480" watchObservedRunningTime="2026-04-16 22:14:04.66820919 +0000 UTC m=+20.964917649" Apr 16 22:14:05.327379 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:05.327154 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 22:14:05.366958 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:05.366922 2571 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-89m59"
Apr 16 22:14:05.367101 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:05.367063 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-89m59" podUID="3eb950f1-309c-40b0-a9b2-13d46c7e2d60"
Apr 16 22:14:05.553064 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:05.553019 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4vrs4" event={"ID":"f4c633be-7a8f-41f5-ba11-b35591028a41","Type":"ContainerStarted","Data":"7c90b56f0da031a5e682d827f0c9cd361c934f8a47348fdb70c4e0d1431fd0e1"}
Apr 16 22:14:05.556175 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:05.555353 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-16.ec2.internal" event={"ID":"9690e09cc5a9f42f04eae10c4110dd59","Type":"ContainerStarted","Data":"d62b1d087a3f70b3a8b74d1868411245de96e81b14b2eb3fee0636da85a1e395"}
Apr 16 22:14:05.559950 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:05.559869 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7" event={"ID":"209bb68d-9579-4062-8913-98bd2b4c61a1","Type":"ContainerStarted","Data":"45495fbbba067faf84e8fb478409bc1b4951b1ad2466c5befbb2bf466c7b6467"}
Apr 16 22:14:05.567987 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:05.567883 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-4vrs4" podStartSLOduration=3.696531701 podStartE2EDuration="21.567867897s" podCreationTimestamp="2026-04-16 22:13:44 +0000 UTC" firstStartedPulling="2026-04-16 22:13:45.612256541 +0000 UTC m=+1.908964976" lastFinishedPulling="2026-04-16 22:14:03.483592732 +0000 UTC m=+19.780301172" observedRunningTime="2026-04-16 22:14:05.567654432 +0000 UTC m=+21.864362890" watchObservedRunningTime="2026-04-16 22:14:05.567867897 +0000 UTC m=+21.864576356"
Apr 16 22:14:05.586182 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:05.586123 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-16.ec2.internal" podStartSLOduration=21.586105705 podStartE2EDuration="21.586105705s" podCreationTimestamp="2026-04-16 22:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:14:05.586091773 +0000 UTC m=+21.882800231" watchObservedRunningTime="2026-04-16 22:14:05.586105705 +0000 UTC m=+21.882814166"
Apr 16 22:14:06.086476 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:06.086438 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-r6qrd"
Apr 16 22:14:06.087652 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:06.087627 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-r6qrd"
Apr 16 22:14:06.315468 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:06.315354 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T22:14:05.327376338Z","UUID":"b3f83f81-9e05-45bb-8d66-9e2caea22cb3","Handler":null,"Name":"","Endpoint":""}
Apr 16 22:14:06.317642 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:06.317598 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 22:14:06.317642 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:06.317650 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 22:14:06.367515 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:06.367459 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gzcjz"
Apr 16 22:14:06.367681 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:06.367599 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gzcjz" podUID="8e076f04-d3ed-41ce-ade7-ecc7c991025a"
Apr 16 22:14:06.564631 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:06.564593 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" event={"ID":"a0f30e25-01b8-4134-8184-2a06ef526c62","Type":"ContainerStarted","Data":"f0b55fb63e2ddf10f7333449dfeefcf736cc5a9b7b8b2da2697667717b901013"}
Apr 16 22:14:06.566598 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:06.566564 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7" event={"ID":"209bb68d-9579-4062-8913-98bd2b4c61a1","Type":"ContainerStarted","Data":"3b4c76ec8a3bcb0794f8d6984b926ae8054113751c0e1f5eea8337f5dbbb5a4d"}
Apr 16 22:14:06.566991 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:06.566964 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-r6qrd"
Apr 16 22:14:06.567546 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:06.567526 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-r6qrd"
Apr 16 22:14:06.587111 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:06.587059 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dl6n7" podStartSLOduration=2.008750959 podStartE2EDuration="22.587041722s" podCreationTimestamp="2026-04-16 22:13:44 +0000 UTC" firstStartedPulling="2026-04-16 22:13:45.561402562 +0000 UTC m=+1.858111001" lastFinishedPulling="2026-04-16 22:14:06.139693321 +0000 UTC m=+22.436401764" observedRunningTime="2026-04-16 22:14:06.586505728 +0000 UTC m=+22.883214183" watchObservedRunningTime="2026-04-16 22:14:06.587041722 +0000 UTC m=+22.883750180"
Apr 16 22:14:07.367019 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:07.366986 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-89m59"
Apr 16 22:14:07.367191 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:07.367128 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-89m59" podUID="3eb950f1-309c-40b0-a9b2-13d46c7e2d60"
Apr 16 22:14:08.367144 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:08.367107 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gzcjz"
Apr 16 22:14:08.367752 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:08.367231 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gzcjz" podUID="8e076f04-d3ed-41ce-ade7-ecc7c991025a"
Apr 16 22:14:09.367156 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:09.367117 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-89m59"
Apr 16 22:14:09.367566 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:09.367235 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-89m59" podUID="3eb950f1-309c-40b0-a9b2-13d46c7e2d60"
Apr 16 22:14:09.574335 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:09.574299 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" event={"ID":"a0f30e25-01b8-4134-8184-2a06ef526c62","Type":"ContainerStarted","Data":"d85f526416a5a95a52664d459cb95c391cb7b69a9c8b2718bbcc5dff81d620ff"}
Apr 16 22:14:09.574621 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:09.574590 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:14:09.576107 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:09.576079 2571 generic.go:358] "Generic (PLEG): container finished" podID="41a45a8e-91e8-4a52-90bd-9fddbbac91d0" containerID="10c629c48bc3d5dfc53834f985d66849ba378d347a01563168cccdc5745c79c5" exitCode=0
Apr 16 22:14:09.576211 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:09.576121 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdp2m" event={"ID":"41a45a8e-91e8-4a52-90bd-9fddbbac91d0","Type":"ContainerDied","Data":"10c629c48bc3d5dfc53834f985d66849ba378d347a01563168cccdc5745c79c5"}
Apr 16 22:14:09.589977 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:09.589953 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:14:09.600742 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:09.600698 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" podStartSLOduration=7.593110166 podStartE2EDuration="25.600683851s" podCreationTimestamp="2026-04-16 22:13:44 +0000 UTC" firstStartedPulling="2026-04-16 22:13:45.63589311 +0000 UTC m=+1.932601547" lastFinishedPulling="2026-04-16 22:14:03.643466797 +0000 UTC m=+19.940175232" observedRunningTime="2026-04-16 22:14:09.599484751 +0000 UTC m=+25.896193208" watchObservedRunningTime="2026-04-16 22:14:09.600683851 +0000 UTC m=+25.897392336"
Apr 16 22:14:10.371702 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:10.371666 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gzcjz"
Apr 16 22:14:10.372101 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:10.371841 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gzcjz" podUID="8e076f04-d3ed-41ce-ade7-ecc7c991025a"
Apr 16 22:14:10.580335 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:10.580088 2571 generic.go:358] "Generic (PLEG): container finished" podID="41a45a8e-91e8-4a52-90bd-9fddbbac91d0" containerID="c0e6f757192eff42a171e83d04d776e9de4df4db7dea6a478ded7e0686de2448" exitCode=0
Apr 16 22:14:10.580492 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:10.580170 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdp2m" event={"ID":"41a45a8e-91e8-4a52-90bd-9fddbbac91d0","Type":"ContainerDied","Data":"c0e6f757192eff42a171e83d04d776e9de4df4db7dea6a478ded7e0686de2448"}
Apr 16 22:14:10.581333 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:10.580784 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:14:10.581333 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:10.580813 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:14:10.596429 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:10.596396 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-88hfq"
Apr 16 22:14:10.606442 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:10.606408 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-gzcjz"]
Apr 16 22:14:10.606645 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:10.606527 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gzcjz"
Apr 16 22:14:10.606645 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:10.606616 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gzcjz" podUID="8e076f04-d3ed-41ce-ade7-ecc7c991025a"
Apr 16 22:14:10.607276 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:10.607245 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-89m59"]
Apr 16 22:14:10.607382 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:10.607368 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-89m59"
Apr 16 22:14:10.607471 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:10.607451 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-89m59" podUID="3eb950f1-309c-40b0-a9b2-13d46c7e2d60"
Apr 16 22:14:11.584579 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:11.584478 2571 generic.go:358] "Generic (PLEG): container finished" podID="41a45a8e-91e8-4a52-90bd-9fddbbac91d0" containerID="fb00dea7bf871fb688ab6c348a3b34e44c5b3cf35f05ad3a6b139689fa51c048" exitCode=0
Apr 16 22:14:11.585182 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:11.584568 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdp2m" event={"ID":"41a45a8e-91e8-4a52-90bd-9fddbbac91d0","Type":"ContainerDied","Data":"fb00dea7bf871fb688ab6c348a3b34e44c5b3cf35f05ad3a6b139689fa51c048"}
Apr 16 22:14:12.367216 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:12.367180 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-89m59"
Apr 16 22:14:12.367400 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:12.367317 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-89m59" podUID="3eb950f1-309c-40b0-a9b2-13d46c7e2d60"
Apr 16 22:14:12.367400 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:12.367373 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gzcjz"
Apr 16 22:14:12.367509 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:12.367484 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gzcjz" podUID="8e076f04-d3ed-41ce-ade7-ecc7c991025a"
Apr 16 22:14:14.368114 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:14.368076 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gzcjz"
Apr 16 22:14:14.368747 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:14.368161 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gzcjz" podUID="8e076f04-d3ed-41ce-ade7-ecc7c991025a"
Apr 16 22:14:14.368747 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:14.368249 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-89m59"
Apr 16 22:14:14.368747 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:14.368387 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-89m59" podUID="3eb950f1-309c-40b0-a9b2-13d46c7e2d60"
Apr 16 22:14:16.367300 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.367258 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gzcjz"
Apr 16 22:14:16.367723 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.367283 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-89m59"
Apr 16 22:14:16.367723 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:16.367406 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gzcjz" podUID="8e076f04-d3ed-41ce-ade7-ecc7c991025a"
Apr 16 22:14:16.367723 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:16.367488 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-89m59" podUID="3eb950f1-309c-40b0-a9b2-13d46c7e2d60"
Apr 16 22:14:16.520567 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.520528 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-16.ec2.internal" event="NodeReady"
Apr 16 22:14:16.520742 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.520700 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 22:14:16.564969 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.564933 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7tmv5"]
Apr 16 22:14:16.589344 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.589301 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-m4s6c"]
Apr 16 22:14:16.589519 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.589484 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7tmv5"
Apr 16 22:14:16.592330 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.592167 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6fpb6\""
Apr 16 22:14:16.592330 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.592203 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 22:14:16.592542 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.592524 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 22:14:16.602199 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.602021 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7tmv5"]
Apr 16 22:14:16.602319 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.602210 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-m4s6c"]
Apr 16 22:14:16.602319 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.602175 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m4s6c"
Apr 16 22:14:16.604952 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.604840 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 22:14:16.604952 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.604934 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 22:14:16.605144 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.604996 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 22:14:16.605144 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.605092 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9bd46\""
Apr 16 22:14:16.694145 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.694063 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e30781c3-7044-4751-961b-48bf5ce84af8-metrics-tls\") pod \"dns-default-7tmv5\" (UID: \"e30781c3-7044-4751-961b-48bf5ce84af8\") " pod="openshift-dns/dns-default-7tmv5"
Apr 16 22:14:16.694145 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.694110 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1c9f736-046e-47ad-a262-894ae7c4f9b6-cert\") pod \"ingress-canary-m4s6c\" (UID: \"e1c9f736-046e-47ad-a262-894ae7c4f9b6\") " pod="openshift-ingress-canary/ingress-canary-m4s6c"
Apr 16 22:14:16.694145 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.694141 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e30781c3-7044-4751-961b-48bf5ce84af8-tmp-dir\") pod \"dns-default-7tmv5\" (UID: \"e30781c3-7044-4751-961b-48bf5ce84af8\") " pod="openshift-dns/dns-default-7tmv5"
Apr 16 22:14:16.694430 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.694262 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e30781c3-7044-4751-961b-48bf5ce84af8-config-volume\") pod \"dns-default-7tmv5\" (UID: \"e30781c3-7044-4751-961b-48bf5ce84af8\") " pod="openshift-dns/dns-default-7tmv5"
Apr 16 22:14:16.694430 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.694303 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzf2l\" (UniqueName: \"kubernetes.io/projected/e1c9f736-046e-47ad-a262-894ae7c4f9b6-kube-api-access-vzf2l\") pod \"ingress-canary-m4s6c\" (UID: \"e1c9f736-046e-47ad-a262-894ae7c4f9b6\") " pod="openshift-ingress-canary/ingress-canary-m4s6c"
Apr 16 22:14:16.694430 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.694375 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkvzm\" (UniqueName: \"kubernetes.io/projected/e30781c3-7044-4751-961b-48bf5ce84af8-kube-api-access-fkvzm\") pod \"dns-default-7tmv5\" (UID: \"e30781c3-7044-4751-961b-48bf5ce84af8\") " pod="openshift-dns/dns-default-7tmv5"
Apr 16 22:14:16.795565 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.795527 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkvzm\" (UniqueName: \"kubernetes.io/projected/e30781c3-7044-4751-961b-48bf5ce84af8-kube-api-access-fkvzm\") pod \"dns-default-7tmv5\" (UID: \"e30781c3-7044-4751-961b-48bf5ce84af8\") " pod="openshift-dns/dns-default-7tmv5"
Apr 16 22:14:16.795754 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.795580 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e30781c3-7044-4751-961b-48bf5ce84af8-metrics-tls\") pod \"dns-default-7tmv5\" (UID: \"e30781c3-7044-4751-961b-48bf5ce84af8\") " pod="openshift-dns/dns-default-7tmv5"
Apr 16 22:14:16.795754 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.795609 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1c9f736-046e-47ad-a262-894ae7c4f9b6-cert\") pod \"ingress-canary-m4s6c\" (UID: \"e1c9f736-046e-47ad-a262-894ae7c4f9b6\") " pod="openshift-ingress-canary/ingress-canary-m4s6c"
Apr 16 22:14:16.795754 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.795631 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e30781c3-7044-4751-961b-48bf5ce84af8-tmp-dir\") pod \"dns-default-7tmv5\" (UID: \"e30781c3-7044-4751-961b-48bf5ce84af8\") " pod="openshift-dns/dns-default-7tmv5"
Apr 16 22:14:16.795754 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.795682 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e30781c3-7044-4751-961b-48bf5ce84af8-config-volume\") pod \"dns-default-7tmv5\" (UID: \"e30781c3-7044-4751-961b-48bf5ce84af8\") " pod="openshift-dns/dns-default-7tmv5"
Apr 16 22:14:16.795754 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.795701 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzf2l\" (UniqueName: \"kubernetes.io/projected/e1c9f736-046e-47ad-a262-894ae7c4f9b6-kube-api-access-vzf2l\") pod \"ingress-canary-m4s6c\" (UID: \"e1c9f736-046e-47ad-a262-894ae7c4f9b6\") " pod="openshift-ingress-canary/ingress-canary-m4s6c"
Apr 16 22:14:16.795754 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:16.795732 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:16.796103 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:16.795732 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:16.796103 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:16.795848 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1c9f736-046e-47ad-a262-894ae7c4f9b6-cert podName:e1c9f736-046e-47ad-a262-894ae7c4f9b6 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:17.295828155 +0000 UTC m=+33.592536599 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1c9f736-046e-47ad-a262-894ae7c4f9b6-cert") pod "ingress-canary-m4s6c" (UID: "e1c9f736-046e-47ad-a262-894ae7c4f9b6") : secret "canary-serving-cert" not found
Apr 16 22:14:16.796103 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:16.795886 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e30781c3-7044-4751-961b-48bf5ce84af8-metrics-tls podName:e30781c3-7044-4751-961b-48bf5ce84af8 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:17.295867745 +0000 UTC m=+33.592576195 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e30781c3-7044-4751-961b-48bf5ce84af8-metrics-tls") pod "dns-default-7tmv5" (UID: "e30781c3-7044-4751-961b-48bf5ce84af8") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:16.796103 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.796090 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e30781c3-7044-4751-961b-48bf5ce84af8-tmp-dir\") pod \"dns-default-7tmv5\" (UID: \"e30781c3-7044-4751-961b-48bf5ce84af8\") " pod="openshift-dns/dns-default-7tmv5"
Apr 16 22:14:16.796448 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.796426 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e30781c3-7044-4751-961b-48bf5ce84af8-config-volume\") pod \"dns-default-7tmv5\" (UID: \"e30781c3-7044-4751-961b-48bf5ce84af8\") " pod="openshift-dns/dns-default-7tmv5"
Apr 16 22:14:16.808370 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.808344 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkvzm\" (UniqueName: \"kubernetes.io/projected/e30781c3-7044-4751-961b-48bf5ce84af8-kube-api-access-fkvzm\") pod \"dns-default-7tmv5\" (UID: \"e30781c3-7044-4751-961b-48bf5ce84af8\") " pod="openshift-dns/dns-default-7tmv5"
Apr 16 22:14:16.808501 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.808443 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzf2l\" (UniqueName: \"kubernetes.io/projected/e1c9f736-046e-47ad-a262-894ae7c4f9b6-kube-api-access-vzf2l\") pod \"ingress-canary-m4s6c\" (UID: \"e1c9f736-046e-47ad-a262-894ae7c4f9b6\") " pod="openshift-ingress-canary/ingress-canary-m4s6c"
Apr 16 22:14:16.997275 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:16.997143 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs\") pod \"network-metrics-daemon-89m59\" (UID: \"3eb950f1-309c-40b0-a9b2-13d46c7e2d60\") " pod="openshift-multus/network-metrics-daemon-89m59"
Apr 16 22:14:16.997449 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:16.997293 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:14:16.997449 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:16.997366 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs podName:3eb950f1-309c-40b0-a9b2-13d46c7e2d60 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:48.997346979 +0000 UTC m=+65.294055420 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs") pod "network-metrics-daemon-89m59" (UID: "3eb950f1-309c-40b0-a9b2-13d46c7e2d60") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:14:17.098170 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:17.098130 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ckdpn\" (UniqueName: \"kubernetes.io/projected/8e076f04-d3ed-41ce-ade7-ecc7c991025a-kube-api-access-ckdpn\") pod \"network-check-target-gzcjz\" (UID: \"8e076f04-d3ed-41ce-ade7-ecc7c991025a\") " pod="openshift-network-diagnostics/network-check-target-gzcjz"
Apr 16 22:14:17.098338 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:17.098252 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:14:17.098338 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:17.098269 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:14:17.098338 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:17.098278 2571 projected.go:194] Error preparing data for projected volume kube-api-access-ckdpn for pod openshift-network-diagnostics/network-check-target-gzcjz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:14:17.098338 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:17.098338 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e076f04-d3ed-41ce-ade7-ecc7c991025a-kube-api-access-ckdpn podName:8e076f04-d3ed-41ce-ade7-ecc7c991025a nodeName:}" failed. No retries permitted until 2026-04-16 22:14:49.098324052 +0000 UTC m=+65.395032487 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-ckdpn" (UniqueName: "kubernetes.io/projected/8e076f04-d3ed-41ce-ade7-ecc7c991025a-kube-api-access-ckdpn") pod "network-check-target-gzcjz" (UID: "8e076f04-d3ed-41ce-ade7-ecc7c991025a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:14:17.299008 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:17.298920 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e30781c3-7044-4751-961b-48bf5ce84af8-metrics-tls\") pod \"dns-default-7tmv5\" (UID: \"e30781c3-7044-4751-961b-48bf5ce84af8\") " pod="openshift-dns/dns-default-7tmv5"
Apr 16 22:14:17.299008 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:17.298955 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1c9f736-046e-47ad-a262-894ae7c4f9b6-cert\") pod \"ingress-canary-m4s6c\" (UID: \"e1c9f736-046e-47ad-a262-894ae7c4f9b6\") " pod="openshift-ingress-canary/ingress-canary-m4s6c"
Apr 16 22:14:17.299229 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:17.299065 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:17.299229 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:17.299127 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e30781c3-7044-4751-961b-48bf5ce84af8-metrics-tls podName:e30781c3-7044-4751-961b-48bf5ce84af8 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:18.299113641 +0000 UTC m=+34.595822077 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e30781c3-7044-4751-961b-48bf5ce84af8-metrics-tls") pod "dns-default-7tmv5" (UID: "e30781c3-7044-4751-961b-48bf5ce84af8") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:17.299229 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:17.299068 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:17.299229 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:17.299193 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1c9f736-046e-47ad-a262-894ae7c4f9b6-cert podName:e1c9f736-046e-47ad-a262-894ae7c4f9b6 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:18.299179639 +0000 UTC m=+34.595888075 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1c9f736-046e-47ad-a262-894ae7c4f9b6-cert") pod "ingress-canary-m4s6c" (UID: "e1c9f736-046e-47ad-a262-894ae7c4f9b6") : secret "canary-serving-cert" not found
Apr 16 22:14:17.601058 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:17.601022 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdp2m" event={"ID":"41a45a8e-91e8-4a52-90bd-9fddbbac91d0","Type":"ContainerStarted","Data":"a2cdc362358ea151ebd286b44e4ff34fa08ed958930a38b7a47467ef008baeb2"}
Apr 16 22:14:18.307284 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:18.307242 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e30781c3-7044-4751-961b-48bf5ce84af8-metrics-tls\") pod \"dns-default-7tmv5\" (UID: \"e30781c3-7044-4751-961b-48bf5ce84af8\") " pod="openshift-dns/dns-default-7tmv5"
Apr 16 22:14:18.307284 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:18.307286 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1c9f736-046e-47ad-a262-894ae7c4f9b6-cert\") pod \"ingress-canary-m4s6c\" (UID: \"e1c9f736-046e-47ad-a262-894ae7c4f9b6\") " pod="openshift-ingress-canary/ingress-canary-m4s6c"
Apr 16 22:14:18.307499 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:18.307392 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:18.307499 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:18.307459 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e30781c3-7044-4751-961b-48bf5ce84af8-metrics-tls podName:e30781c3-7044-4751-961b-48bf5ce84af8 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:20.307443151 +0000 UTC m=+36.604151587 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e30781c3-7044-4751-961b-48bf5ce84af8-metrics-tls") pod "dns-default-7tmv5" (UID: "e30781c3-7044-4751-961b-48bf5ce84af8") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:18.307499 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:18.307398 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:18.307615 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:18.307519 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1c9f736-046e-47ad-a262-894ae7c4f9b6-cert podName:e1c9f736-046e-47ad-a262-894ae7c4f9b6 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:20.307507063 +0000 UTC m=+36.604215499 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1c9f736-046e-47ad-a262-894ae7c4f9b6-cert") pod "ingress-canary-m4s6c" (UID: "e1c9f736-046e-47ad-a262-894ae7c4f9b6") : secret "canary-serving-cert" not found
Apr 16 22:14:18.367755 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:18.367714 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gzcjz"
Apr 16 22:14:18.368008 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:18.367964 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-89m59" Apr 16 22:14:18.371685 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:18.371659 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 22:14:18.371837 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:18.371800 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 22:14:18.371837 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:18.371824 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5lt98\"" Apr 16 22:14:18.371964 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:18.371848 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gj5qm\"" Apr 16 22:14:18.371964 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:18.371797 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 22:14:18.604758 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:18.604727 2571 generic.go:358] "Generic (PLEG): container finished" podID="41a45a8e-91e8-4a52-90bd-9fddbbac91d0" containerID="a2cdc362358ea151ebd286b44e4ff34fa08ed958930a38b7a47467ef008baeb2" exitCode=0 Apr 16 22:14:18.605241 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:18.604795 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdp2m" event={"ID":"41a45a8e-91e8-4a52-90bd-9fddbbac91d0","Type":"ContainerDied","Data":"a2cdc362358ea151ebd286b44e4ff34fa08ed958930a38b7a47467ef008baeb2"} Apr 16 22:14:19.609556 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:19.609520 2571 generic.go:358] "Generic (PLEG): container finished" podID="41a45a8e-91e8-4a52-90bd-9fddbbac91d0" 
containerID="83b5a3157c5213acac64c079f8d8bd04200659056f060e2bcfc74886d33037a3" exitCode=0 Apr 16 22:14:19.609556 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:19.609561 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdp2m" event={"ID":"41a45a8e-91e8-4a52-90bd-9fddbbac91d0","Type":"ContainerDied","Data":"83b5a3157c5213acac64c079f8d8bd04200659056f060e2bcfc74886d33037a3"} Apr 16 22:14:20.319317 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:20.319280 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e30781c3-7044-4751-961b-48bf5ce84af8-metrics-tls\") pod \"dns-default-7tmv5\" (UID: \"e30781c3-7044-4751-961b-48bf5ce84af8\") " pod="openshift-dns/dns-default-7tmv5" Apr 16 22:14:20.319317 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:20.319316 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1c9f736-046e-47ad-a262-894ae7c4f9b6-cert\") pod \"ingress-canary-m4s6c\" (UID: \"e1c9f736-046e-47ad-a262-894ae7c4f9b6\") " pod="openshift-ingress-canary/ingress-canary-m4s6c" Apr 16 22:14:20.319527 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:20.319435 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:20.319527 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:20.319496 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e30781c3-7044-4751-961b-48bf5ce84af8-metrics-tls podName:e30781c3-7044-4751-961b-48bf5ce84af8 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:24.319482079 +0000 UTC m=+40.616190518 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e30781c3-7044-4751-961b-48bf5ce84af8-metrics-tls") pod "dns-default-7tmv5" (UID: "e30781c3-7044-4751-961b-48bf5ce84af8") : secret "dns-default-metrics-tls" not found Apr 16 22:14:20.319599 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:20.319435 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:20.319599 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:20.319564 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1c9f736-046e-47ad-a262-894ae7c4f9b6-cert podName:e1c9f736-046e-47ad-a262-894ae7c4f9b6 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:24.319551 +0000 UTC m=+40.616259455 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1c9f736-046e-47ad-a262-894ae7c4f9b6-cert") pod "ingress-canary-m4s6c" (UID: "e1c9f736-046e-47ad-a262-894ae7c4f9b6") : secret "canary-serving-cert" not found Apr 16 22:14:20.617031 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:20.616994 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdp2m" event={"ID":"41a45a8e-91e8-4a52-90bd-9fddbbac91d0","Type":"ContainerStarted","Data":"c985e7c11eb710755a4ea7280afab0b4dec6b6c2990032ca8a010013d1f68e40"} Apr 16 22:14:20.640052 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:20.639994 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jdp2m" podStartSLOduration=4.81978942 podStartE2EDuration="36.639979339s" podCreationTimestamp="2026-04-16 22:13:44 +0000 UTC" firstStartedPulling="2026-04-16 22:13:45.60759827 +0000 UTC m=+1.904306707" lastFinishedPulling="2026-04-16 22:14:17.427788187 +0000 UTC m=+33.724496626" observedRunningTime="2026-04-16 22:14:20.638702918 +0000 UTC m=+36.935411376" 
watchObservedRunningTime="2026-04-16 22:14:20.639979339 +0000 UTC m=+36.936687796" Apr 16 22:14:24.349042 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:24.349002 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e30781c3-7044-4751-961b-48bf5ce84af8-metrics-tls\") pod \"dns-default-7tmv5\" (UID: \"e30781c3-7044-4751-961b-48bf5ce84af8\") " pod="openshift-dns/dns-default-7tmv5" Apr 16 22:14:24.349042 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:24.349044 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1c9f736-046e-47ad-a262-894ae7c4f9b6-cert\") pod \"ingress-canary-m4s6c\" (UID: \"e1c9f736-046e-47ad-a262-894ae7c4f9b6\") " pod="openshift-ingress-canary/ingress-canary-m4s6c" Apr 16 22:14:24.349472 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:24.349149 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:24.349472 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:24.349160 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:24.349472 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:24.349203 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1c9f736-046e-47ad-a262-894ae7c4f9b6-cert podName:e1c9f736-046e-47ad-a262-894ae7c4f9b6 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:32.349189516 +0000 UTC m=+48.645897953 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1c9f736-046e-47ad-a262-894ae7c4f9b6-cert") pod "ingress-canary-m4s6c" (UID: "e1c9f736-046e-47ad-a262-894ae7c4f9b6") : secret "canary-serving-cert" not found Apr 16 22:14:24.349472 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:24.349217 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e30781c3-7044-4751-961b-48bf5ce84af8-metrics-tls podName:e30781c3-7044-4751-961b-48bf5ce84af8 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:32.349210036 +0000 UTC m=+48.645918473 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e30781c3-7044-4751-961b-48bf5ce84af8-metrics-tls") pod "dns-default-7tmv5" (UID: "e30781c3-7044-4751-961b-48bf5ce84af8") : secret "dns-default-metrics-tls" not found Apr 16 22:14:32.403520 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.403471 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e30781c3-7044-4751-961b-48bf5ce84af8-metrics-tls\") pod \"dns-default-7tmv5\" (UID: \"e30781c3-7044-4751-961b-48bf5ce84af8\") " pod="openshift-dns/dns-default-7tmv5" Apr 16 22:14:32.403520 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.403517 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1c9f736-046e-47ad-a262-894ae7c4f9b6-cert\") pod \"ingress-canary-m4s6c\" (UID: \"e1c9f736-046e-47ad-a262-894ae7c4f9b6\") " pod="openshift-ingress-canary/ingress-canary-m4s6c" Apr 16 22:14:32.404061 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:32.403620 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:32.404061 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:32.403688 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/e1c9f736-046e-47ad-a262-894ae7c4f9b6-cert podName:e1c9f736-046e-47ad-a262-894ae7c4f9b6 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:48.403672848 +0000 UTC m=+64.700381284 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1c9f736-046e-47ad-a262-894ae7c4f9b6-cert") pod "ingress-canary-m4s6c" (UID: "e1c9f736-046e-47ad-a262-894ae7c4f9b6") : secret "canary-serving-cert" not found Apr 16 22:14:32.404061 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:32.403620 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:32.404061 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:32.403749 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e30781c3-7044-4751-961b-48bf5ce84af8-metrics-tls podName:e30781c3-7044-4751-961b-48bf5ce84af8 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:48.403737077 +0000 UTC m=+64.700445528 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e30781c3-7044-4751-961b-48bf5ce84af8-metrics-tls") pod "dns-default-7tmv5" (UID: "e30781c3-7044-4751-961b-48bf5ce84af8") : secret "dns-default-metrics-tls" not found Apr 16 22:14:32.660070 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.659980 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6466454749-kb29z"] Apr 16 22:14:32.676693 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.676647 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6"] Apr 16 22:14:32.676870 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.676704 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6466454749-kb29z" Apr 16 22:14:32.679441 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.679412 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 22:14:32.679441 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.679428 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 22:14:32.679621 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.679492 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 22:14:32.680513 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.680494 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 22:14:32.680616 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.680557 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-b4kjs\"" Apr 16 22:14:32.693523 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.693494 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6466454749-kb29z"] Apr 16 22:14:32.693523 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.693530 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6"] Apr 16 22:14:32.693714 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.693679 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" Apr 16 22:14:32.697172 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.696990 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 22:14:32.698535 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.698507 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 22:14:32.699184 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.699159 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 22:14:32.699431 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.699158 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 22:14:32.805956 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.805917 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/59a4219d-b4ed-41ed-9716-c7eeef4e6e46-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6466454749-kb29z\" (UID: \"59a4219d-b4ed-41ed-9716-c7eeef4e6e46\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6466454749-kb29z" Apr 16 22:14:32.806136 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.805984 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f9b2f64e-a3d3-4c02-b799-6b30990d7566-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-676b766678-sblk6\" (UID: \"f9b2f64e-a3d3-4c02-b799-6b30990d7566\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" Apr 16 22:14:32.806136 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.806007 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srmqx\" (UniqueName: \"kubernetes.io/projected/f9b2f64e-a3d3-4c02-b799-6b30990d7566-kube-api-access-srmqx\") pod \"cluster-proxy-proxy-agent-676b766678-sblk6\" (UID: \"f9b2f64e-a3d3-4c02-b799-6b30990d7566\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" Apr 16 22:14:32.806136 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.806049 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f9b2f64e-a3d3-4c02-b799-6b30990d7566-hub\") pod \"cluster-proxy-proxy-agent-676b766678-sblk6\" (UID: \"f9b2f64e-a3d3-4c02-b799-6b30990d7566\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" Apr 16 22:14:32.806136 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.806076 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f9b2f64e-a3d3-4c02-b799-6b30990d7566-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-676b766678-sblk6\" (UID: \"f9b2f64e-a3d3-4c02-b799-6b30990d7566\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" Apr 16 22:14:32.806136 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.806094 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw9l7\" (UniqueName: \"kubernetes.io/projected/59a4219d-b4ed-41ed-9716-c7eeef4e6e46-kube-api-access-mw9l7\") pod \"managed-serviceaccount-addon-agent-6466454749-kb29z\" (UID: \"59a4219d-b4ed-41ed-9716-c7eeef4e6e46\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6466454749-kb29z" Apr 16 22:14:32.806136 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.806109 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f9b2f64e-a3d3-4c02-b799-6b30990d7566-ca\") pod \"cluster-proxy-proxy-agent-676b766678-sblk6\" (UID: \"f9b2f64e-a3d3-4c02-b799-6b30990d7566\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" Apr 16 22:14:32.806136 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.806136 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f9b2f64e-a3d3-4c02-b799-6b30990d7566-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-676b766678-sblk6\" (UID: \"f9b2f64e-a3d3-4c02-b799-6b30990d7566\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" Apr 16 22:14:32.907436 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.907403 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/59a4219d-b4ed-41ed-9716-c7eeef4e6e46-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6466454749-kb29z\" (UID: \"59a4219d-b4ed-41ed-9716-c7eeef4e6e46\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6466454749-kb29z" Apr 16 22:14:32.907620 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.907471 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f9b2f64e-a3d3-4c02-b799-6b30990d7566-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-676b766678-sblk6\" (UID: \"f9b2f64e-a3d3-4c02-b799-6b30990d7566\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" Apr 16 22:14:32.907620 ip-10-0-130-16 
kubenswrapper[2571]: I0416 22:14:32.907496 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-srmqx\" (UniqueName: \"kubernetes.io/projected/f9b2f64e-a3d3-4c02-b799-6b30990d7566-kube-api-access-srmqx\") pod \"cluster-proxy-proxy-agent-676b766678-sblk6\" (UID: \"f9b2f64e-a3d3-4c02-b799-6b30990d7566\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" Apr 16 22:14:32.907620 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.907517 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f9b2f64e-a3d3-4c02-b799-6b30990d7566-hub\") pod \"cluster-proxy-proxy-agent-676b766678-sblk6\" (UID: \"f9b2f64e-a3d3-4c02-b799-6b30990d7566\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" Apr 16 22:14:32.907620 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.907538 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f9b2f64e-a3d3-4c02-b799-6b30990d7566-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-676b766678-sblk6\" (UID: \"f9b2f64e-a3d3-4c02-b799-6b30990d7566\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" Apr 16 22:14:32.907620 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.907555 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mw9l7\" (UniqueName: \"kubernetes.io/projected/59a4219d-b4ed-41ed-9716-c7eeef4e6e46-kube-api-access-mw9l7\") pod \"managed-serviceaccount-addon-agent-6466454749-kb29z\" (UID: \"59a4219d-b4ed-41ed-9716-c7eeef4e6e46\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6466454749-kb29z" Apr 16 22:14:32.907620 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.907570 2571 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f9b2f64e-a3d3-4c02-b799-6b30990d7566-ca\") pod \"cluster-proxy-proxy-agent-676b766678-sblk6\" (UID: \"f9b2f64e-a3d3-4c02-b799-6b30990d7566\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" Apr 16 22:14:32.907945 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.907680 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f9b2f64e-a3d3-4c02-b799-6b30990d7566-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-676b766678-sblk6\" (UID: \"f9b2f64e-a3d3-4c02-b799-6b30990d7566\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" Apr 16 22:14:32.908649 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.908625 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f9b2f64e-a3d3-4c02-b799-6b30990d7566-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-676b766678-sblk6\" (UID: \"f9b2f64e-a3d3-4c02-b799-6b30990d7566\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" Apr 16 22:14:32.911332 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.911276 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f9b2f64e-a3d3-4c02-b799-6b30990d7566-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-676b766678-sblk6\" (UID: \"f9b2f64e-a3d3-4c02-b799-6b30990d7566\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" Apr 16 22:14:32.911455 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.911359 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f9b2f64e-a3d3-4c02-b799-6b30990d7566-ca\") pod \"cluster-proxy-proxy-agent-676b766678-sblk6\" (UID: 
\"f9b2f64e-a3d3-4c02-b799-6b30990d7566\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" Apr 16 22:14:32.911455 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.911367 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f9b2f64e-a3d3-4c02-b799-6b30990d7566-hub\") pod \"cluster-proxy-proxy-agent-676b766678-sblk6\" (UID: \"f9b2f64e-a3d3-4c02-b799-6b30990d7566\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" Apr 16 22:14:32.911529 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.911509 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f9b2f64e-a3d3-4c02-b799-6b30990d7566-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-676b766678-sblk6\" (UID: \"f9b2f64e-a3d3-4c02-b799-6b30990d7566\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" Apr 16 22:14:32.911529 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.911514 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/59a4219d-b4ed-41ed-9716-c7eeef4e6e46-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6466454749-kb29z\" (UID: \"59a4219d-b4ed-41ed-9716-c7eeef4e6e46\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6466454749-kb29z" Apr 16 22:14:32.916354 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.916326 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw9l7\" (UniqueName: \"kubernetes.io/projected/59a4219d-b4ed-41ed-9716-c7eeef4e6e46-kube-api-access-mw9l7\") pod \"managed-serviceaccount-addon-agent-6466454749-kb29z\" (UID: \"59a4219d-b4ed-41ed-9716-c7eeef4e6e46\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6466454749-kb29z" Apr 16 22:14:32.916461 ip-10-0-130-16 
kubenswrapper[2571]: I0416 22:14:32.916400 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-srmqx\" (UniqueName: \"kubernetes.io/projected/f9b2f64e-a3d3-4c02-b799-6b30990d7566-kube-api-access-srmqx\") pod \"cluster-proxy-proxy-agent-676b766678-sblk6\" (UID: \"f9b2f64e-a3d3-4c02-b799-6b30990d7566\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" Apr 16 22:14:32.996392 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:32.996358 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6466454749-kb29z" Apr 16 22:14:33.005206 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:33.005173 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" Apr 16 22:14:33.198694 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:33.198598 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6466454749-kb29z"] Apr 16 22:14:33.202257 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:14:33.202227 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59a4219d_b4ed_41ed_9716_c7eeef4e6e46.slice/crio-59cae557c0606a3e4396e5d87713606b0f3dfdf3660c865220bb72dcd69a2f00 WatchSource:0}: Error finding container 59cae557c0606a3e4396e5d87713606b0f3dfdf3660c865220bb72dcd69a2f00: Status 404 returned error can't find the container with id 59cae557c0606a3e4396e5d87713606b0f3dfdf3660c865220bb72dcd69a2f00 Apr 16 22:14:33.202448 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:33.202429 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6"] Apr 16 22:14:33.205188 ip-10-0-130-16 kubenswrapper[2571]: W0416 
22:14:33.205157 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9b2f64e_a3d3_4c02_b799_6b30990d7566.slice/crio-5baa8e0d7a84da4982190119cb671f02052540f58f21563151eed15c4073ea88 WatchSource:0}: Error finding container 5baa8e0d7a84da4982190119cb671f02052540f58f21563151eed15c4073ea88: Status 404 returned error can't find the container with id 5baa8e0d7a84da4982190119cb671f02052540f58f21563151eed15c4073ea88 Apr 16 22:14:33.641936 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:33.641895 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" event={"ID":"f9b2f64e-a3d3-4c02-b799-6b30990d7566","Type":"ContainerStarted","Data":"5baa8e0d7a84da4982190119cb671f02052540f58f21563151eed15c4073ea88"} Apr 16 22:14:33.642872 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:33.642850 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6466454749-kb29z" event={"ID":"59a4219d-b4ed-41ed-9716-c7eeef4e6e46","Type":"ContainerStarted","Data":"59cae557c0606a3e4396e5d87713606b0f3dfdf3660c865220bb72dcd69a2f00"} Apr 16 22:14:36.651069 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:36.650974 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" event={"ID":"f9b2f64e-a3d3-4c02-b799-6b30990d7566","Type":"ContainerStarted","Data":"a148c8d2b25a84afa44f1a4ac54dfa4c0b35bd6a30f278112043886739ce8180"} Apr 16 22:14:36.652175 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:36.652151 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6466454749-kb29z" 
event={"ID":"59a4219d-b4ed-41ed-9716-c7eeef4e6e46","Type":"ContainerStarted","Data":"d0058e422c91d0698e3fd3bbfcf6977434201d9f84ec624fab52741010ffd892"} Apr 16 22:14:36.669140 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:36.669077 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6466454749-kb29z" podStartSLOduration=1.503954168 podStartE2EDuration="4.669056385s" podCreationTimestamp="2026-04-16 22:14:32 +0000 UTC" firstStartedPulling="2026-04-16 22:14:33.204367155 +0000 UTC m=+49.501075591" lastFinishedPulling="2026-04-16 22:14:36.369469371 +0000 UTC m=+52.666177808" observedRunningTime="2026-04-16 22:14:36.668439107 +0000 UTC m=+52.965147565" watchObservedRunningTime="2026-04-16 22:14:36.669056385 +0000 UTC m=+52.965764844" Apr 16 22:14:38.658248 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:38.658157 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" event={"ID":"f9b2f64e-a3d3-4c02-b799-6b30990d7566","Type":"ContainerStarted","Data":"d879c4ef38b753d84cfddad714446e8243474fa72fdb42977ce643b2b758cf38"} Apr 16 22:14:38.658248 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:38.658197 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" event={"ID":"f9b2f64e-a3d3-4c02-b799-6b30990d7566","Type":"ContainerStarted","Data":"8dbaa44a928f4de4574ace5553d3d2c2765d11ee0300e79cd3c4231fb0232cbe"} Apr 16 22:14:38.676936 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:38.676883 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" podStartSLOduration=1.496813658 podStartE2EDuration="6.676869904s" podCreationTimestamp="2026-04-16 22:14:32 +0000 UTC" firstStartedPulling="2026-04-16 22:14:33.222263463 +0000 
UTC m=+49.518971903" lastFinishedPulling="2026-04-16 22:14:38.40231971 +0000 UTC m=+54.699028149" observedRunningTime="2026-04-16 22:14:38.675513076 +0000 UTC m=+54.972221526" watchObservedRunningTime="2026-04-16 22:14:38.676869904 +0000 UTC m=+54.973578359" Apr 16 22:14:42.597354 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:42.597325 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-88hfq" Apr 16 22:14:48.431081 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:48.431038 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e30781c3-7044-4751-961b-48bf5ce84af8-metrics-tls\") pod \"dns-default-7tmv5\" (UID: \"e30781c3-7044-4751-961b-48bf5ce84af8\") " pod="openshift-dns/dns-default-7tmv5" Apr 16 22:14:48.431081 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:48.431080 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1c9f736-046e-47ad-a262-894ae7c4f9b6-cert\") pod \"ingress-canary-m4s6c\" (UID: \"e1c9f736-046e-47ad-a262-894ae7c4f9b6\") " pod="openshift-ingress-canary/ingress-canary-m4s6c" Apr 16 22:14:48.431522 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:48.431195 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:48.431522 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:48.431198 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:48.431522 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:48.431271 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e30781c3-7044-4751-961b-48bf5ce84af8-metrics-tls podName:e30781c3-7044-4751-961b-48bf5ce84af8 nodeName:}" failed. 
No retries permitted until 2026-04-16 22:15:20.431254207 +0000 UTC m=+96.727962644 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e30781c3-7044-4751-961b-48bf5ce84af8-metrics-tls") pod "dns-default-7tmv5" (UID: "e30781c3-7044-4751-961b-48bf5ce84af8") : secret "dns-default-metrics-tls" not found Apr 16 22:14:48.431522 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:48.431287 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1c9f736-046e-47ad-a262-894ae7c4f9b6-cert podName:e1c9f736-046e-47ad-a262-894ae7c4f9b6 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:20.43128027 +0000 UTC m=+96.727988706 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1c9f736-046e-47ad-a262-894ae7c4f9b6-cert") pod "ingress-canary-m4s6c" (UID: "e1c9f736-046e-47ad-a262-894ae7c4f9b6") : secret "canary-serving-cert" not found Apr 16 22:14:49.035129 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:49.035084 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs\") pod \"network-metrics-daemon-89m59\" (UID: \"3eb950f1-309c-40b0-a9b2-13d46c7e2d60\") " pod="openshift-multus/network-metrics-daemon-89m59" Apr 16 22:14:49.038038 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:49.038018 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 22:14:49.046199 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:49.046172 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 22:14:49.046327 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:14:49.046247 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs podName:3eb950f1-309c-40b0-a9b2-13d46c7e2d60 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:53.046226715 +0000 UTC m=+129.342935169 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs") pod "network-metrics-daemon-89m59" (UID: "3eb950f1-309c-40b0-a9b2-13d46c7e2d60") : secret "metrics-daemon-secret" not found Apr 16 22:14:49.136201 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:49.136151 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ckdpn\" (UniqueName: \"kubernetes.io/projected/8e076f04-d3ed-41ce-ade7-ecc7c991025a-kube-api-access-ckdpn\") pod \"network-check-target-gzcjz\" (UID: \"8e076f04-d3ed-41ce-ade7-ecc7c991025a\") " pod="openshift-network-diagnostics/network-check-target-gzcjz" Apr 16 22:14:49.138999 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:49.138978 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 22:14:49.149503 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:49.149482 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 22:14:49.160178 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:49.160147 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckdpn\" (UniqueName: \"kubernetes.io/projected/8e076f04-d3ed-41ce-ade7-ecc7c991025a-kube-api-access-ckdpn\") pod \"network-check-target-gzcjz\" (UID: \"8e076f04-d3ed-41ce-ade7-ecc7c991025a\") " pod="openshift-network-diagnostics/network-check-target-gzcjz" Apr 16 22:14:49.280627 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:49.280593 2571 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5lt98\"" Apr 16 22:14:49.288785 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:49.288698 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gzcjz" Apr 16 22:14:49.403691 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:49.403659 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-gzcjz"] Apr 16 22:14:49.406964 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:14:49.406929 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e076f04_d3ed_41ce_ade7_ecc7c991025a.slice/crio-1929ce493f3e30abb851bc4274c3d322d44bafb60bacab6b660dd58162827f52 WatchSource:0}: Error finding container 1929ce493f3e30abb851bc4274c3d322d44bafb60bacab6b660dd58162827f52: Status 404 returned error can't find the container with id 1929ce493f3e30abb851bc4274c3d322d44bafb60bacab6b660dd58162827f52 Apr 16 22:14:49.681987 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:49.681954 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-gzcjz" event={"ID":"8e076f04-d3ed-41ce-ade7-ecc7c991025a","Type":"ContainerStarted","Data":"1929ce493f3e30abb851bc4274c3d322d44bafb60bacab6b660dd58162827f52"} Apr 16 22:14:52.690044 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:52.690003 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-gzcjz" event={"ID":"8e076f04-d3ed-41ce-ade7-ecc7c991025a","Type":"ContainerStarted","Data":"a9c8ce2ea810b58fa7eb0b1a9522c40722ef6f5c7f85fb3727510efd86aa8f4a"} Apr 16 22:14:52.690498 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:52.690165 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-gzcjz" Apr 16 
22:14:52.707713 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:14:52.707660 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-gzcjz" podStartSLOduration=66.125409773 podStartE2EDuration="1m8.707645039s" podCreationTimestamp="2026-04-16 22:13:44 +0000 UTC" firstStartedPulling="2026-04-16 22:14:49.408721988 +0000 UTC m=+65.705430425" lastFinishedPulling="2026-04-16 22:14:51.990957238 +0000 UTC m=+68.287665691" observedRunningTime="2026-04-16 22:14:52.706679078 +0000 UTC m=+69.003387536" watchObservedRunningTime="2026-04-16 22:14:52.707645039 +0000 UTC m=+69.004353497" Apr 16 22:15:20.460225 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:15:20.460118 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e30781c3-7044-4751-961b-48bf5ce84af8-metrics-tls\") pod \"dns-default-7tmv5\" (UID: \"e30781c3-7044-4751-961b-48bf5ce84af8\") " pod="openshift-dns/dns-default-7tmv5" Apr 16 22:15:20.460225 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:15:20.460173 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1c9f736-046e-47ad-a262-894ae7c4f9b6-cert\") pod \"ingress-canary-m4s6c\" (UID: \"e1c9f736-046e-47ad-a262-894ae7c4f9b6\") " pod="openshift-ingress-canary/ingress-canary-m4s6c" Apr 16 22:15:20.460678 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:15:20.460265 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:15:20.460678 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:15:20.460341 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e30781c3-7044-4751-961b-48bf5ce84af8-metrics-tls podName:e30781c3-7044-4751-961b-48bf5ce84af8 nodeName:}" failed. 
No retries permitted until 2026-04-16 22:16:24.46032297 +0000 UTC m=+160.757031405 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e30781c3-7044-4751-961b-48bf5ce84af8-metrics-tls") pod "dns-default-7tmv5" (UID: "e30781c3-7044-4751-961b-48bf5ce84af8") : secret "dns-default-metrics-tls" not found Apr 16 22:15:20.460678 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:15:20.460272 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:15:20.460678 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:15:20.460406 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1c9f736-046e-47ad-a262-894ae7c4f9b6-cert podName:e1c9f736-046e-47ad-a262-894ae7c4f9b6 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:24.460393906 +0000 UTC m=+160.757102342 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1c9f736-046e-47ad-a262-894ae7c4f9b6-cert") pod "ingress-canary-m4s6c" (UID: "e1c9f736-046e-47ad-a262-894ae7c4f9b6") : secret "canary-serving-cert" not found Apr 16 22:15:23.694722 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:15:23.694692 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-gzcjz" Apr 16 22:15:53.102484 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:15:53.102431 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs\") pod \"network-metrics-daemon-89m59\" (UID: \"3eb950f1-309c-40b0-a9b2-13d46c7e2d60\") " pod="openshift-multus/network-metrics-daemon-89m59" Apr 16 22:15:53.102996 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:15:53.102600 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret 
"metrics-daemon-secret" not found Apr 16 22:15:53.102996 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:15:53.102677 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs podName:3eb950f1-309c-40b0-a9b2-13d46c7e2d60 nodeName:}" failed. No retries permitted until 2026-04-16 22:17:55.102656952 +0000 UTC m=+251.399365408 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs") pod "network-metrics-daemon-89m59" (UID: "3eb950f1-309c-40b0-a9b2-13d46c7e2d60") : secret "metrics-daemon-secret" not found Apr 16 22:16:03.779544 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:03.779514 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-nwdxm_43ef0782-3c86-49cc-a384-b383bf10c750/dns-node-resolver/0.log" Apr 16 22:16:04.378450 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:04.378418 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kjs6v_b7a91f2b-d56f-4cd1-86e6-533c19173cab/node-ca/0.log" Apr 16 22:16:13.006574 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:13.006517 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" podUID="f9b2f64e-a3d3-4c02-b799-6b30990d7566" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 22:16:19.600900 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:16:19.600860 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-7tmv5" podUID="e30781c3-7044-4751-961b-48bf5ce84af8" Apr 16 22:16:19.613104 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:16:19.613050 2571 pod_workers.go:1301] "Error 
syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-m4s6c" podUID="e1c9f736-046e-47ad-a262-894ae7c4f9b6" Apr 16 22:16:19.854061 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:19.853975 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m4s6c" Apr 16 22:16:19.854241 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:19.853976 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7tmv5" Apr 16 22:16:21.382542 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:16:21.382494 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-89m59" podUID="3eb950f1-309c-40b0-a9b2-13d46c7e2d60" Apr 16 22:16:23.006927 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:23.006882 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" podUID="f9b2f64e-a3d3-4c02-b799-6b30990d7566" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 22:16:24.529525 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:24.529479 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e30781c3-7044-4751-961b-48bf5ce84af8-metrics-tls\") pod \"dns-default-7tmv5\" (UID: \"e30781c3-7044-4751-961b-48bf5ce84af8\") " pod="openshift-dns/dns-default-7tmv5" Apr 16 22:16:24.529525 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:24.529524 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1c9f736-046e-47ad-a262-894ae7c4f9b6-cert\") pod 
\"ingress-canary-m4s6c\" (UID: \"e1c9f736-046e-47ad-a262-894ae7c4f9b6\") " pod="openshift-ingress-canary/ingress-canary-m4s6c" Apr 16 22:16:24.531915 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:24.531893 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e30781c3-7044-4751-961b-48bf5ce84af8-metrics-tls\") pod \"dns-default-7tmv5\" (UID: \"e30781c3-7044-4751-961b-48bf5ce84af8\") " pod="openshift-dns/dns-default-7tmv5" Apr 16 22:16:24.532016 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:24.531997 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1c9f736-046e-47ad-a262-894ae7c4f9b6-cert\") pod \"ingress-canary-m4s6c\" (UID: \"e1c9f736-046e-47ad-a262-894ae7c4f9b6\") " pod="openshift-ingress-canary/ingress-canary-m4s6c" Apr 16 22:16:24.657606 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:24.657571 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6fpb6\"" Apr 16 22:16:24.658581 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:24.658562 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9bd46\"" Apr 16 22:16:24.665576 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:24.665541 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m4s6c" Apr 16 22:16:24.665725 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:24.665639 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-7tmv5" Apr 16 22:16:24.796531 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:24.796453 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-m4s6c"] Apr 16 22:16:24.800076 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:16:24.800047 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1c9f736_046e_47ad_a262_894ae7c4f9b6.slice/crio-c972ad7609a20fe59840dfb7251d62477570f14a61a45fe91a07dd0c2fcc9c9e WatchSource:0}: Error finding container c972ad7609a20fe59840dfb7251d62477570f14a61a45fe91a07dd0c2fcc9c9e: Status 404 returned error can't find the container with id c972ad7609a20fe59840dfb7251d62477570f14a61a45fe91a07dd0c2fcc9c9e Apr 16 22:16:24.812602 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:24.812574 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7tmv5"] Apr 16 22:16:24.816029 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:16:24.815996 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode30781c3_7044_4751_961b_48bf5ce84af8.slice/crio-27c63e15b7f9410057726a728a88260aecaf280051618ac6ed06244be96c4c3c WatchSource:0}: Error finding container 27c63e15b7f9410057726a728a88260aecaf280051618ac6ed06244be96c4c3c: Status 404 returned error can't find the container with id 27c63e15b7f9410057726a728a88260aecaf280051618ac6ed06244be96c4c3c Apr 16 22:16:24.865109 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:24.865070 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-m4s6c" event={"ID":"e1c9f736-046e-47ad-a262-894ae7c4f9b6","Type":"ContainerStarted","Data":"c972ad7609a20fe59840dfb7251d62477570f14a61a45fe91a07dd0c2fcc9c9e"} Apr 16 22:16:24.865948 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:24.865920 2571 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-dns/dns-default-7tmv5" event={"ID":"e30781c3-7044-4751-961b-48bf5ce84af8","Type":"ContainerStarted","Data":"27c63e15b7f9410057726a728a88260aecaf280051618ac6ed06244be96c4c3c"} Apr 16 22:16:25.372247 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.371994 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-t9crn"] Apr 16 22:16:25.375368 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.375290 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-t9crn" Apr 16 22:16:25.379027 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.378995 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 22:16:25.380601 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.380420 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 22:16:25.383465 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.382741 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-bpg4g\"" Apr 16 22:16:25.383465 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.383092 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 22:16:25.383465 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.383152 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 22:16:25.395589 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.395557 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6b5ff79494-smjn6"] Apr 16 22:16:25.398779 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.398737 2571 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-t9crn"] Apr 16 22:16:25.398914 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.398893 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" Apr 16 22:16:25.402864 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.402834 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 22:16:25.403173 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.403095 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 22:16:25.403294 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.403201 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-2kcsr\"" Apr 16 22:16:25.405541 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.405353 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 22:16:25.411127 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.411103 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 22:16:25.433669 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.433630 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5sv8\" (UniqueName: \"kubernetes.io/projected/0830df32-674c-4f25-b525-e9b82449155d-kube-api-access-r5sv8\") pod \"insights-runtime-extractor-t9crn\" (UID: \"0830df32-674c-4f25-b525-e9b82449155d\") " pod="openshift-insights/insights-runtime-extractor-t9crn" Apr 16 22:16:25.434422 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.433915 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-6b5ff79494-smjn6"] Apr 16 22:16:25.434422 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.433956 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0830df32-674c-4f25-b525-e9b82449155d-data-volume\") pod \"insights-runtime-extractor-t9crn\" (UID: \"0830df32-674c-4f25-b525-e9b82449155d\") " pod="openshift-insights/insights-runtime-extractor-t9crn" Apr 16 22:16:25.434422 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.433990 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2392c624-be45-4127-b416-d45610367c07-registry-tls\") pod \"image-registry-6b5ff79494-smjn6\" (UID: \"2392c624-be45-4127-b416-d45610367c07\") " pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" Apr 16 22:16:25.434422 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.434013 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2392c624-be45-4127-b416-d45610367c07-registry-certificates\") pod \"image-registry-6b5ff79494-smjn6\" (UID: \"2392c624-be45-4127-b416-d45610367c07\") " pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" Apr 16 22:16:25.434422 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.434100 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0830df32-674c-4f25-b525-e9b82449155d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-t9crn\" (UID: \"0830df32-674c-4f25-b525-e9b82449155d\") " pod="openshift-insights/insights-runtime-extractor-t9crn" Apr 16 22:16:25.434422 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.434123 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2392c624-be45-4127-b416-d45610367c07-trusted-ca\") pod \"image-registry-6b5ff79494-smjn6\" (UID: \"2392c624-be45-4127-b416-d45610367c07\") " pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" Apr 16 22:16:25.434422 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.434149 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48jxl\" (UniqueName: \"kubernetes.io/projected/2392c624-be45-4127-b416-d45610367c07-kube-api-access-48jxl\") pod \"image-registry-6b5ff79494-smjn6\" (UID: \"2392c624-be45-4127-b416-d45610367c07\") " pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" Apr 16 22:16:25.434422 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.434177 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0830df32-674c-4f25-b525-e9b82449155d-crio-socket\") pod \"insights-runtime-extractor-t9crn\" (UID: \"0830df32-674c-4f25-b525-e9b82449155d\") " pod="openshift-insights/insights-runtime-extractor-t9crn" Apr 16 22:16:25.434422 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.434201 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2392c624-be45-4127-b416-d45610367c07-installation-pull-secrets\") pod \"image-registry-6b5ff79494-smjn6\" (UID: \"2392c624-be45-4127-b416-d45610367c07\") " pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" Apr 16 22:16:25.434422 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.434230 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2392c624-be45-4127-b416-d45610367c07-bound-sa-token\") pod 
\"image-registry-6b5ff79494-smjn6\" (UID: \"2392c624-be45-4127-b416-d45610367c07\") " pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" Apr 16 22:16:25.434422 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.434264 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2392c624-be45-4127-b416-d45610367c07-image-registry-private-configuration\") pod \"image-registry-6b5ff79494-smjn6\" (UID: \"2392c624-be45-4127-b416-d45610367c07\") " pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" Apr 16 22:16:25.434422 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.434299 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0830df32-674c-4f25-b525-e9b82449155d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-t9crn\" (UID: \"0830df32-674c-4f25-b525-e9b82449155d\") " pod="openshift-insights/insights-runtime-extractor-t9crn" Apr 16 22:16:25.434422 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.434325 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2392c624-be45-4127-b416-d45610367c07-ca-trust-extracted\") pod \"image-registry-6b5ff79494-smjn6\" (UID: \"2392c624-be45-4127-b416-d45610367c07\") " pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" Apr 16 22:16:25.535415 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.535365 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2392c624-be45-4127-b416-d45610367c07-image-registry-private-configuration\") pod \"image-registry-6b5ff79494-smjn6\" (UID: \"2392c624-be45-4127-b416-d45610367c07\") " 
pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" Apr 16 22:16:25.535882 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.535432 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0830df32-674c-4f25-b525-e9b82449155d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-t9crn\" (UID: \"0830df32-674c-4f25-b525-e9b82449155d\") " pod="openshift-insights/insights-runtime-extractor-t9crn" Apr 16 22:16:25.535882 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.535464 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2392c624-be45-4127-b416-d45610367c07-ca-trust-extracted\") pod \"image-registry-6b5ff79494-smjn6\" (UID: \"2392c624-be45-4127-b416-d45610367c07\") " pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" Apr 16 22:16:25.535882 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.535505 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5sv8\" (UniqueName: \"kubernetes.io/projected/0830df32-674c-4f25-b525-e9b82449155d-kube-api-access-r5sv8\") pod \"insights-runtime-extractor-t9crn\" (UID: \"0830df32-674c-4f25-b525-e9b82449155d\") " pod="openshift-insights/insights-runtime-extractor-t9crn" Apr 16 22:16:25.535882 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.535549 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0830df32-674c-4f25-b525-e9b82449155d-data-volume\") pod \"insights-runtime-extractor-t9crn\" (UID: \"0830df32-674c-4f25-b525-e9b82449155d\") " pod="openshift-insights/insights-runtime-extractor-t9crn" Apr 16 22:16:25.535882 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.535575 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/2392c624-be45-4127-b416-d45610367c07-registry-tls\") pod \"image-registry-6b5ff79494-smjn6\" (UID: \"2392c624-be45-4127-b416-d45610367c07\") " pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" Apr 16 22:16:25.535882 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.535602 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2392c624-be45-4127-b416-d45610367c07-registry-certificates\") pod \"image-registry-6b5ff79494-smjn6\" (UID: \"2392c624-be45-4127-b416-d45610367c07\") " pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" Apr 16 22:16:25.535882 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.535662 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0830df32-674c-4f25-b525-e9b82449155d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-t9crn\" (UID: \"0830df32-674c-4f25-b525-e9b82449155d\") " pod="openshift-insights/insights-runtime-extractor-t9crn" Apr 16 22:16:25.535882 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.535684 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2392c624-be45-4127-b416-d45610367c07-trusted-ca\") pod \"image-registry-6b5ff79494-smjn6\" (UID: \"2392c624-be45-4127-b416-d45610367c07\") " pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" Apr 16 22:16:25.535882 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.535711 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48jxl\" (UniqueName: \"kubernetes.io/projected/2392c624-be45-4127-b416-d45610367c07-kube-api-access-48jxl\") pod \"image-registry-6b5ff79494-smjn6\" (UID: \"2392c624-be45-4127-b416-d45610367c07\") " pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" Apr 16 22:16:25.535882 
ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.535739 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0830df32-674c-4f25-b525-e9b82449155d-crio-socket\") pod \"insights-runtime-extractor-t9crn\" (UID: \"0830df32-674c-4f25-b525-e9b82449155d\") " pod="openshift-insights/insights-runtime-extractor-t9crn" Apr 16 22:16:25.535882 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.535785 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2392c624-be45-4127-b416-d45610367c07-installation-pull-secrets\") pod \"image-registry-6b5ff79494-smjn6\" (UID: \"2392c624-be45-4127-b416-d45610367c07\") " pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" Apr 16 22:16:25.535882 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.535812 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2392c624-be45-4127-b416-d45610367c07-bound-sa-token\") pod \"image-registry-6b5ff79494-smjn6\" (UID: \"2392c624-be45-4127-b416-d45610367c07\") " pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" Apr 16 22:16:25.536452 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.536361 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2392c624-be45-4127-b416-d45610367c07-ca-trust-extracted\") pod \"image-registry-6b5ff79494-smjn6\" (UID: \"2392c624-be45-4127-b416-d45610367c07\") " pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" Apr 16 22:16:25.536452 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.536441 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0830df32-674c-4f25-b525-e9b82449155d-data-volume\") pod \"insights-runtime-extractor-t9crn\" 
(UID: \"0830df32-674c-4f25-b525-e9b82449155d\") " pod="openshift-insights/insights-runtime-extractor-t9crn" Apr 16 22:16:25.536555 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.536530 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0830df32-674c-4f25-b525-e9b82449155d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-t9crn\" (UID: \"0830df32-674c-4f25-b525-e9b82449155d\") " pod="openshift-insights/insights-runtime-extractor-t9crn" Apr 16 22:16:25.537006 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.536687 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2392c624-be45-4127-b416-d45610367c07-registry-certificates\") pod \"image-registry-6b5ff79494-smjn6\" (UID: \"2392c624-be45-4127-b416-d45610367c07\") " pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" Apr 16 22:16:25.537006 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.536739 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0830df32-674c-4f25-b525-e9b82449155d-crio-socket\") pod \"insights-runtime-extractor-t9crn\" (UID: \"0830df32-674c-4f25-b525-e9b82449155d\") " pod="openshift-insights/insights-runtime-extractor-t9crn" Apr 16 22:16:25.537499 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.537475 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2392c624-be45-4127-b416-d45610367c07-trusted-ca\") pod \"image-registry-6b5ff79494-smjn6\" (UID: \"2392c624-be45-4127-b416-d45610367c07\") " pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" Apr 16 22:16:25.540438 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.540388 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/0830df32-674c-4f25-b525-e9b82449155d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-t9crn\" (UID: \"0830df32-674c-4f25-b525-e9b82449155d\") " pod="openshift-insights/insights-runtime-extractor-t9crn" Apr 16 22:16:25.540438 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.540408 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2392c624-be45-4127-b416-d45610367c07-installation-pull-secrets\") pod \"image-registry-6b5ff79494-smjn6\" (UID: \"2392c624-be45-4127-b416-d45610367c07\") " pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" Apr 16 22:16:25.540603 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.540517 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2392c624-be45-4127-b416-d45610367c07-registry-tls\") pod \"image-registry-6b5ff79494-smjn6\" (UID: \"2392c624-be45-4127-b416-d45610367c07\") " pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" Apr 16 22:16:25.540702 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.540676 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2392c624-be45-4127-b416-d45610367c07-image-registry-private-configuration\") pod \"image-registry-6b5ff79494-smjn6\" (UID: \"2392c624-be45-4127-b416-d45610367c07\") " pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" Apr 16 22:16:25.547520 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.547411 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5sv8\" (UniqueName: \"kubernetes.io/projected/0830df32-674c-4f25-b525-e9b82449155d-kube-api-access-r5sv8\") pod \"insights-runtime-extractor-t9crn\" (UID: \"0830df32-674c-4f25-b525-e9b82449155d\") " pod="openshift-insights/insights-runtime-extractor-t9crn" Apr 16 
22:16:25.548176 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.548151 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-48jxl\" (UniqueName: \"kubernetes.io/projected/2392c624-be45-4127-b416-d45610367c07-kube-api-access-48jxl\") pod \"image-registry-6b5ff79494-smjn6\" (UID: \"2392c624-be45-4127-b416-d45610367c07\") " pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" Apr 16 22:16:25.562457 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.562425 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2392c624-be45-4127-b416-d45610367c07-bound-sa-token\") pod \"image-registry-6b5ff79494-smjn6\" (UID: \"2392c624-be45-4127-b416-d45610367c07\") " pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" Apr 16 22:16:25.687632 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.687536 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-t9crn" Apr 16 22:16:25.712882 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.712846 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" Apr 16 22:16:25.897196 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.897141 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6b5ff79494-smjn6"] Apr 16 22:16:25.908079 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:25.908045 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-t9crn"] Apr 16 22:16:26.644076 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:16:26.644037 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2392c624_be45_4127_b416_d45610367c07.slice/crio-9094bc0f54bc46d9ab7bc603c4b8fb4eb5de1ed7fc4d383bed3626bd52b574e1 WatchSource:0}: Error finding container 9094bc0f54bc46d9ab7bc603c4b8fb4eb5de1ed7fc4d383bed3626bd52b574e1: Status 404 returned error can't find the container with id 9094bc0f54bc46d9ab7bc603c4b8fb4eb5de1ed7fc4d383bed3626bd52b574e1 Apr 16 22:16:26.644911 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:16:26.644884 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0830df32_674c_4f25_b525_e9b82449155d.slice/crio-32e4ea1253a8e454f61a1ae1087cd49ef5a6e9b4dbaa183eefd6d2da8d00dbea WatchSource:0}: Error finding container 32e4ea1253a8e454f61a1ae1087cd49ef5a6e9b4dbaa183eefd6d2da8d00dbea: Status 404 returned error can't find the container with id 32e4ea1253a8e454f61a1ae1087cd49ef5a6e9b4dbaa183eefd6d2da8d00dbea Apr 16 22:16:26.874342 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:26.874285 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7tmv5" event={"ID":"e30781c3-7044-4751-961b-48bf5ce84af8","Type":"ContainerStarted","Data":"eb331db4d5b145b9829153fb94c6989f04339af3a7b5cae7c7337b4c1f1152a0"} Apr 16 22:16:26.876248 ip-10-0-130-16 kubenswrapper[2571]: I0416 
22:16:26.876172 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-t9crn" event={"ID":"0830df32-674c-4f25-b525-e9b82449155d","Type":"ContainerStarted","Data":"34011e60a6a5559ffeafed0e2cb5309e356cfd719d9742b907e7d718aaeb5eef"} Apr 16 22:16:26.876248 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:26.876213 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-t9crn" event={"ID":"0830df32-674c-4f25-b525-e9b82449155d","Type":"ContainerStarted","Data":"32e4ea1253a8e454f61a1ae1087cd49ef5a6e9b4dbaa183eefd6d2da8d00dbea"} Apr 16 22:16:26.878082 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:26.878032 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" event={"ID":"2392c624-be45-4127-b416-d45610367c07","Type":"ContainerStarted","Data":"300c255621cd99ce01d17067eed4c1e5f2e3cb12c84f0bb6e92f7a9f02daf813"} Apr 16 22:16:26.878082 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:26.878061 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" event={"ID":"2392c624-be45-4127-b416-d45610367c07","Type":"ContainerStarted","Data":"9094bc0f54bc46d9ab7bc603c4b8fb4eb5de1ed7fc4d383bed3626bd52b574e1"} Apr 16 22:16:26.878395 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:26.878272 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" Apr 16 22:16:26.879697 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:26.879658 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-m4s6c" event={"ID":"e1c9f736-046e-47ad-a262-894ae7c4f9b6","Type":"ContainerStarted","Data":"cff0472c3f3aed84b6023538d5d7ff6b7bc29a4c7e8004b8e7f8c931ac3b8ebf"} Apr 16 22:16:26.900232 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:26.900119 2571 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" podStartSLOduration=1.9001014710000002 podStartE2EDuration="1.900101471s" podCreationTimestamp="2026-04-16 22:16:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:16:26.898746953 +0000 UTC m=+163.195455410" watchObservedRunningTime="2026-04-16 22:16:26.900101471 +0000 UTC m=+163.196809929" Apr 16 22:16:26.915063 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:26.914971 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-m4s6c" podStartSLOduration=129.013721782 podStartE2EDuration="2m10.914951729s" podCreationTimestamp="2026-04-16 22:14:16 +0000 UTC" firstStartedPulling="2026-04-16 22:16:24.801912922 +0000 UTC m=+161.098621361" lastFinishedPulling="2026-04-16 22:16:26.703142871 +0000 UTC m=+162.999851308" observedRunningTime="2026-04-16 22:16:26.913888092 +0000 UTC m=+163.210596550" watchObservedRunningTime="2026-04-16 22:16:26.914951729 +0000 UTC m=+163.211660191" Apr 16 22:16:27.884304 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:27.884263 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7tmv5" event={"ID":"e30781c3-7044-4751-961b-48bf5ce84af8","Type":"ContainerStarted","Data":"575d7e75cad387fd3cd9513738cea2e1040c7065e7533dea996b658583b2e352"} Apr 16 22:16:27.884811 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:27.884395 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-7tmv5" Apr 16 22:16:27.886002 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:27.885972 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-t9crn" 
event={"ID":"0830df32-674c-4f25-b525-e9b82449155d","Type":"ContainerStarted","Data":"444362a24598ff5b52fae629110d22d9ad76d0f2d37b54d7b7d9f5207eae92a2"} Apr 16 22:16:27.902466 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:27.902398 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7tmv5" podStartSLOduration=130.024090405 podStartE2EDuration="2m11.902377656s" podCreationTimestamp="2026-04-16 22:14:16 +0000 UTC" firstStartedPulling="2026-04-16 22:16:24.817706516 +0000 UTC m=+161.114414955" lastFinishedPulling="2026-04-16 22:16:26.695993766 +0000 UTC m=+162.992702206" observedRunningTime="2026-04-16 22:16:27.90086195 +0000 UTC m=+164.197570410" watchObservedRunningTime="2026-04-16 22:16:27.902377656 +0000 UTC m=+164.199086096" Apr 16 22:16:28.890589 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:28.890551 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-t9crn" event={"ID":"0830df32-674c-4f25-b525-e9b82449155d","Type":"ContainerStarted","Data":"1da0eddb53cb1882348d876e67ae0b0e02d9f71ec0af63d3211b0796aefa118b"} Apr 16 22:16:28.911219 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:28.911167 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-t9crn" podStartSLOduration=2.002238292 podStartE2EDuration="3.911151809s" podCreationTimestamp="2026-04-16 22:16:25 +0000 UTC" firstStartedPulling="2026-04-16 22:16:26.767863078 +0000 UTC m=+163.064571524" lastFinishedPulling="2026-04-16 22:16:28.676776604 +0000 UTC m=+164.973485041" observedRunningTime="2026-04-16 22:16:28.909427793 +0000 UTC m=+165.206136250" watchObservedRunningTime="2026-04-16 22:16:28.911151809 +0000 UTC m=+165.207860266" Apr 16 22:16:33.006037 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:33.005948 2571 prober.go:120] "Probe failed" probeType="Liveness" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" podUID="f9b2f64e-a3d3-4c02-b799-6b30990d7566" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 22:16:33.006037 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:33.006022 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" Apr 16 22:16:33.006525 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:33.006478 2571 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"d879c4ef38b753d84cfddad714446e8243474fa72fdb42977ce643b2b758cf38"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 16 22:16:33.006566 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:33.006526 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" podUID="f9b2f64e-a3d3-4c02-b799-6b30990d7566" containerName="service-proxy" containerID="cri-o://d879c4ef38b753d84cfddad714446e8243474fa72fdb42977ce643b2b758cf38" gracePeriod=30 Apr 16 22:16:33.904212 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:33.904174 2571 generic.go:358] "Generic (PLEG): container finished" podID="f9b2f64e-a3d3-4c02-b799-6b30990d7566" containerID="d879c4ef38b753d84cfddad714446e8243474fa72fdb42977ce643b2b758cf38" exitCode=2 Apr 16 22:16:33.904212 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:33.904222 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" event={"ID":"f9b2f64e-a3d3-4c02-b799-6b30990d7566","Type":"ContainerDied","Data":"d879c4ef38b753d84cfddad714446e8243474fa72fdb42977ce643b2b758cf38"} Apr 16 
22:16:33.904428 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:33.904247 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-676b766678-sblk6" event={"ID":"f9b2f64e-a3d3-4c02-b799-6b30990d7566","Type":"ContainerStarted","Data":"fe69d7d1d6e1ea10ccbf216002d4d238cace21c694fa80d4d9447bbc3a10b529"} Apr 16 22:16:35.367504 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:35.367455 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-89m59" Apr 16 22:16:36.913098 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:36.913066 2571 generic.go:358] "Generic (PLEG): container finished" podID="59a4219d-b4ed-41ed-9716-c7eeef4e6e46" containerID="d0058e422c91d0698e3fd3bbfcf6977434201d9f84ec624fab52741010ffd892" exitCode=255 Apr 16 22:16:36.913492 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:36.913131 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6466454749-kb29z" event={"ID":"59a4219d-b4ed-41ed-9716-c7eeef4e6e46","Type":"ContainerDied","Data":"d0058e422c91d0698e3fd3bbfcf6977434201d9f84ec624fab52741010ffd892"} Apr 16 22:16:36.913492 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:36.913454 2571 scope.go:117] "RemoveContainer" containerID="d0058e422c91d0698e3fd3bbfcf6977434201d9f84ec624fab52741010ffd892" Apr 16 22:16:37.893355 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:37.893319 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7tmv5" Apr 16 22:16:37.918066 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:37.918032 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6466454749-kb29z" 
event={"ID":"59a4219d-b4ed-41ed-9716-c7eeef4e6e46","Type":"ContainerStarted","Data":"451af0528435ee20ab4168ab81a4f0ad350d0f75e496548fe7ad079a8baab6f8"} Apr 16 22:16:40.070897 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.070862 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-tgnzk"] Apr 16 22:16:40.073974 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.073955 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tgnzk" Apr 16 22:16:40.080020 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.079995 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 22:16:40.080710 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.080438 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 22:16:40.080710 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.080438 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 22:16:40.081512 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.081493 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 22:16:40.081610 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.081537 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 22:16:40.081610 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.081545 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-gp7ks\"" Apr 16 22:16:40.082228 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.082211 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 22:16:40.140937 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.140886 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-root\") pod \"node-exporter-tgnzk\" (UID: \"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079\") " pod="openshift-monitoring/node-exporter-tgnzk" Apr 16 22:16:40.140937 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.140936 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-node-exporter-accelerators-collector-config\") pod \"node-exporter-tgnzk\" (UID: \"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079\") " pod="openshift-monitoring/node-exporter-tgnzk" Apr 16 22:16:40.141151 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.141002 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-node-exporter-textfile\") pod \"node-exporter-tgnzk\" (UID: \"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079\") " pod="openshift-monitoring/node-exporter-tgnzk" Apr 16 22:16:40.141151 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.141035 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tgnzk\" (UID: \"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079\") " pod="openshift-monitoring/node-exporter-tgnzk" Apr 16 22:16:40.141151 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.141070 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-nrd8t\" (UniqueName: \"kubernetes.io/projected/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-kube-api-access-nrd8t\") pod \"node-exporter-tgnzk\" (UID: \"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079\") " pod="openshift-monitoring/node-exporter-tgnzk" Apr 16 22:16:40.141151 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.141096 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-node-exporter-wtmp\") pod \"node-exporter-tgnzk\" (UID: \"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079\") " pod="openshift-monitoring/node-exporter-tgnzk" Apr 16 22:16:40.141295 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.141161 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-metrics-client-ca\") pod \"node-exporter-tgnzk\" (UID: \"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079\") " pod="openshift-monitoring/node-exporter-tgnzk" Apr 16 22:16:40.141295 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.141194 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-node-exporter-tls\") pod \"node-exporter-tgnzk\" (UID: \"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079\") " pod="openshift-monitoring/node-exporter-tgnzk" Apr 16 22:16:40.141295 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.141210 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-sys\") pod \"node-exporter-tgnzk\" (UID: \"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079\") " pod="openshift-monitoring/node-exporter-tgnzk" Apr 16 22:16:40.241651 ip-10-0-130-16 
kubenswrapper[2571]: I0416 22:16:40.241603 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrd8t\" (UniqueName: \"kubernetes.io/projected/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-kube-api-access-nrd8t\") pod \"node-exporter-tgnzk\" (UID: \"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079\") " pod="openshift-monitoring/node-exporter-tgnzk" Apr 16 22:16:40.241846 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.241664 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-node-exporter-wtmp\") pod \"node-exporter-tgnzk\" (UID: \"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079\") " pod="openshift-monitoring/node-exporter-tgnzk" Apr 16 22:16:40.241846 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.241698 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-metrics-client-ca\") pod \"node-exporter-tgnzk\" (UID: \"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079\") " pod="openshift-monitoring/node-exporter-tgnzk" Apr 16 22:16:40.241846 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.241728 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-node-exporter-tls\") pod \"node-exporter-tgnzk\" (UID: \"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079\") " pod="openshift-monitoring/node-exporter-tgnzk" Apr 16 22:16:40.241846 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.241753 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-sys\") pod \"node-exporter-tgnzk\" (UID: \"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079\") " pod="openshift-monitoring/node-exporter-tgnzk" Apr 16 22:16:40.241846 
ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.241838 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-sys\") pod \"node-exporter-tgnzk\" (UID: \"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079\") " pod="openshift-monitoring/node-exporter-tgnzk"
Apr 16 22:16:40.242021 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.241836 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-root\") pod \"node-exporter-tgnzk\" (UID: \"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079\") " pod="openshift-monitoring/node-exporter-tgnzk"
Apr 16 22:16:40.242021 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.241886 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-node-exporter-accelerators-collector-config\") pod \"node-exporter-tgnzk\" (UID: \"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079\") " pod="openshift-monitoring/node-exporter-tgnzk"
Apr 16 22:16:40.242021 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.241911 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-node-exporter-wtmp\") pod \"node-exporter-tgnzk\" (UID: \"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079\") " pod="openshift-monitoring/node-exporter-tgnzk"
Apr 16 22:16:40.242021 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.241940 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-node-exporter-textfile\") pod \"node-exporter-tgnzk\" (UID: \"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079\") " pod="openshift-monitoring/node-exporter-tgnzk"
Apr 16 22:16:40.242021 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.241967 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tgnzk\" (UID: \"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079\") " pod="openshift-monitoring/node-exporter-tgnzk"
Apr 16 22:16:40.242021 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.241966 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-root\") pod \"node-exporter-tgnzk\" (UID: \"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079\") " pod="openshift-monitoring/node-exporter-tgnzk"
Apr 16 22:16:40.242219 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:16:40.242073 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 22:16:40.242219 ip-10-0-130-16 kubenswrapper[2571]: E0416 22:16:40.242151 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-node-exporter-tls podName:cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:40.742130687 +0000 UTC m=+177.038839132 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-node-exporter-tls") pod "node-exporter-tgnzk" (UID: "cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079") : secret "node-exporter-tls" not found
Apr 16 22:16:40.242289 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.242224 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-node-exporter-textfile\") pod \"node-exporter-tgnzk\" (UID: \"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079\") " pod="openshift-monitoring/node-exporter-tgnzk"
Apr 16 22:16:40.242372 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.242353 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-metrics-client-ca\") pod \"node-exporter-tgnzk\" (UID: \"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079\") " pod="openshift-monitoring/node-exporter-tgnzk"
Apr 16 22:16:40.242409 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.242391 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-node-exporter-accelerators-collector-config\") pod \"node-exporter-tgnzk\" (UID: \"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079\") " pod="openshift-monitoring/node-exporter-tgnzk"
Apr 16 22:16:40.244227 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.244210 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tgnzk\" (UID: \"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079\") " pod="openshift-monitoring/node-exporter-tgnzk"
Apr 16 22:16:40.255348 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.255318 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrd8t\" (UniqueName: \"kubernetes.io/projected/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-kube-api-access-nrd8t\") pod \"node-exporter-tgnzk\" (UID: \"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079\") " pod="openshift-monitoring/node-exporter-tgnzk"
Apr 16 22:16:40.745287 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.745245 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-node-exporter-tls\") pod \"node-exporter-tgnzk\" (UID: \"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079\") " pod="openshift-monitoring/node-exporter-tgnzk"
Apr 16 22:16:40.747537 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.747505 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079-node-exporter-tls\") pod \"node-exporter-tgnzk\" (UID: \"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079\") " pod="openshift-monitoring/node-exporter-tgnzk"
Apr 16 22:16:40.983934 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:40.983883 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tgnzk"
Apr 16 22:16:40.993742 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:16:40.993705 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb7d7efd_7ddb_49c8_9e32_c6f52cd9e079.slice/crio-589d9de66d1692031f0b22d6ef0050da0fdbed67d6e0728be5aece185abc643b WatchSource:0}: Error finding container 589d9de66d1692031f0b22d6ef0050da0fdbed67d6e0728be5aece185abc643b: Status 404 returned error can't find the container with id 589d9de66d1692031f0b22d6ef0050da0fdbed67d6e0728be5aece185abc643b
Apr 16 22:16:41.928575 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:41.928539 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tgnzk" event={"ID":"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079","Type":"ContainerStarted","Data":"589d9de66d1692031f0b22d6ef0050da0fdbed67d6e0728be5aece185abc643b"}
Apr 16 22:16:42.932931 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:42.932883 2571 generic.go:358] "Generic (PLEG): container finished" podID="cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079" containerID="297054b4255ce6a2494b10fcb423454aaea09793f9024989114b6debef3929ec" exitCode=0
Apr 16 22:16:42.933320 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:42.932937 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tgnzk" event={"ID":"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079","Type":"ContainerDied","Data":"297054b4255ce6a2494b10fcb423454aaea09793f9024989114b6debef3929ec"}
Apr 16 22:16:43.936965 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:43.936912 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tgnzk" event={"ID":"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079","Type":"ContainerStarted","Data":"b7236ec901b4bc16c930f40fde5221eb5f29a727896e3eda39f7dde9f3cd082e"}
Apr 16 22:16:43.936965 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:43.936958 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tgnzk" event={"ID":"cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079","Type":"ContainerStarted","Data":"4fb8550b25b5b3fc8fa3411b56f3055db6dac8366f0c1be309a6c20605e86f22"}
Apr 16 22:16:43.962013 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:43.961959 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-tgnzk" podStartSLOduration=2.681112166 podStartE2EDuration="3.961940074s" podCreationTimestamp="2026-04-16 22:16:40 +0000 UTC" firstStartedPulling="2026-04-16 22:16:40.995537189 +0000 UTC m=+177.292245626" lastFinishedPulling="2026-04-16 22:16:42.276365081 +0000 UTC m=+178.573073534" observedRunningTime="2026-04-16 22:16:43.960645687 +0000 UTC m=+180.257354151" watchObservedRunningTime="2026-04-16 22:16:43.961940074 +0000 UTC m=+180.258648531"
Apr 16 22:16:44.627926 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.627888 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5b46d89c44-vdw5b"]
Apr 16 22:16:44.630910 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.630891 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b"
Apr 16 22:16:44.635484 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.635462 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 22:16:44.637248 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.637200 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 16 22:16:44.637321 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.637260 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 16 22:16:44.637456 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.637428 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-48cl3n7nak43f\""
Apr 16 22:16:44.637456 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.637450 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-z8tsm\""
Apr 16 22:16:44.637620 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.637501 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 16 22:16:44.661563 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.661530 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5b46d89c44-vdw5b"]
Apr 16 22:16:44.676441 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.676411 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6728eef1-a5dc-4915-9aaf-67a89820cad1-client-ca-bundle\") pod \"metrics-server-5b46d89c44-vdw5b\" (UID: \"6728eef1-a5dc-4915-9aaf-67a89820cad1\") " pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b"
Apr 16 22:16:44.676441 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.676447 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/6728eef1-a5dc-4915-9aaf-67a89820cad1-secret-metrics-server-tls\") pod \"metrics-server-5b46d89c44-vdw5b\" (UID: \"6728eef1-a5dc-4915-9aaf-67a89820cad1\") " pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b"
Apr 16 22:16:44.676644 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.676480 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/6728eef1-a5dc-4915-9aaf-67a89820cad1-audit-log\") pod \"metrics-server-5b46d89c44-vdw5b\" (UID: \"6728eef1-a5dc-4915-9aaf-67a89820cad1\") " pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b"
Apr 16 22:16:44.676644 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.676598 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/6728eef1-a5dc-4915-9aaf-67a89820cad1-metrics-server-audit-profiles\") pod \"metrics-server-5b46d89c44-vdw5b\" (UID: \"6728eef1-a5dc-4915-9aaf-67a89820cad1\") " pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b"
Apr 16 22:16:44.676644 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.676635 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/6728eef1-a5dc-4915-9aaf-67a89820cad1-secret-metrics-server-client-certs\") pod \"metrics-server-5b46d89c44-vdw5b\" (UID: \"6728eef1-a5dc-4915-9aaf-67a89820cad1\") " pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b"
Apr 16 22:16:44.676737 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.676655 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6728eef1-a5dc-4915-9aaf-67a89820cad1-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5b46d89c44-vdw5b\" (UID: \"6728eef1-a5dc-4915-9aaf-67a89820cad1\") " pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b"
Apr 16 22:16:44.676737 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.676679 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rr54\" (UniqueName: \"kubernetes.io/projected/6728eef1-a5dc-4915-9aaf-67a89820cad1-kube-api-access-9rr54\") pod \"metrics-server-5b46d89c44-vdw5b\" (UID: \"6728eef1-a5dc-4915-9aaf-67a89820cad1\") " pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b"
Apr 16 22:16:44.777753 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.777703 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/6728eef1-a5dc-4915-9aaf-67a89820cad1-metrics-server-audit-profiles\") pod \"metrics-server-5b46d89c44-vdw5b\" (UID: \"6728eef1-a5dc-4915-9aaf-67a89820cad1\") " pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b"
Apr 16 22:16:44.777942 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.777784 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/6728eef1-a5dc-4915-9aaf-67a89820cad1-secret-metrics-server-client-certs\") pod \"metrics-server-5b46d89c44-vdw5b\" (UID: \"6728eef1-a5dc-4915-9aaf-67a89820cad1\") " pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b"
Apr 16 22:16:44.777942 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.777815 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6728eef1-a5dc-4915-9aaf-67a89820cad1-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5b46d89c44-vdw5b\" (UID: \"6728eef1-a5dc-4915-9aaf-67a89820cad1\") " pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b"
Apr 16 22:16:44.777942 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.777838 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rr54\" (UniqueName: \"kubernetes.io/projected/6728eef1-a5dc-4915-9aaf-67a89820cad1-kube-api-access-9rr54\") pod \"metrics-server-5b46d89c44-vdw5b\" (UID: \"6728eef1-a5dc-4915-9aaf-67a89820cad1\") " pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b"
Apr 16 22:16:44.777942 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.777862 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6728eef1-a5dc-4915-9aaf-67a89820cad1-client-ca-bundle\") pod \"metrics-server-5b46d89c44-vdw5b\" (UID: \"6728eef1-a5dc-4915-9aaf-67a89820cad1\") " pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b"
Apr 16 22:16:44.777942 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.777888 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/6728eef1-a5dc-4915-9aaf-67a89820cad1-secret-metrics-server-tls\") pod \"metrics-server-5b46d89c44-vdw5b\" (UID: \"6728eef1-a5dc-4915-9aaf-67a89820cad1\") " pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b"
Apr 16 22:16:44.777942 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.777933 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/6728eef1-a5dc-4915-9aaf-67a89820cad1-audit-log\") pod \"metrics-server-5b46d89c44-vdw5b\" (UID: \"6728eef1-a5dc-4915-9aaf-67a89820cad1\") " pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b"
Apr 16 22:16:44.778376 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.778351 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/6728eef1-a5dc-4915-9aaf-67a89820cad1-audit-log\") pod \"metrics-server-5b46d89c44-vdw5b\" (UID: \"6728eef1-a5dc-4915-9aaf-67a89820cad1\") " pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b"
Apr 16 22:16:44.778580 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.778560 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6728eef1-a5dc-4915-9aaf-67a89820cad1-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5b46d89c44-vdw5b\" (UID: \"6728eef1-a5dc-4915-9aaf-67a89820cad1\") " pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b"
Apr 16 22:16:44.778932 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.778904 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/6728eef1-a5dc-4915-9aaf-67a89820cad1-metrics-server-audit-profiles\") pod \"metrics-server-5b46d89c44-vdw5b\" (UID: \"6728eef1-a5dc-4915-9aaf-67a89820cad1\") " pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b"
Apr 16 22:16:44.780398 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.780372 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/6728eef1-a5dc-4915-9aaf-67a89820cad1-secret-metrics-server-client-certs\") pod \"metrics-server-5b46d89c44-vdw5b\" (UID: \"6728eef1-a5dc-4915-9aaf-67a89820cad1\") " pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b"
Apr 16 22:16:44.780523 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.780504 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6728eef1-a5dc-4915-9aaf-67a89820cad1-client-ca-bundle\") pod \"metrics-server-5b46d89c44-vdw5b\" (UID: \"6728eef1-a5dc-4915-9aaf-67a89820cad1\") " pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b"
Apr 16 22:16:44.780565 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.780531 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/6728eef1-a5dc-4915-9aaf-67a89820cad1-secret-metrics-server-tls\") pod \"metrics-server-5b46d89c44-vdw5b\" (UID: \"6728eef1-a5dc-4915-9aaf-67a89820cad1\") " pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b"
Apr 16 22:16:44.809018 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.808991 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rr54\" (UniqueName: \"kubernetes.io/projected/6728eef1-a5dc-4915-9aaf-67a89820cad1-kube-api-access-9rr54\") pod \"metrics-server-5b46d89c44-vdw5b\" (UID: \"6728eef1-a5dc-4915-9aaf-67a89820cad1\") " pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b"
Apr 16 22:16:44.939519 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:44.939430 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b"
Apr 16 22:16:45.100515 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:45.100424 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5b46d89c44-vdw5b"]
Apr 16 22:16:45.103014 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:16:45.102988 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6728eef1_a5dc_4915_9aaf_67a89820cad1.slice/crio-3db5b39d55123194f3e5108773af279177aa513e72abb910489659d177dc40ac WatchSource:0}: Error finding container 3db5b39d55123194f3e5108773af279177aa513e72abb910489659d177dc40ac: Status 404 returned error can't find the container with id 3db5b39d55123194f3e5108773af279177aa513e72abb910489659d177dc40ac
Apr 16 22:16:45.943157 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:45.943105 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b" event={"ID":"6728eef1-a5dc-4915-9aaf-67a89820cad1","Type":"ContainerStarted","Data":"3db5b39d55123194f3e5108773af279177aa513e72abb910489659d177dc40ac"}
Apr 16 22:16:46.345610 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.345568 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 22:16:46.349076 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.349053 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.359447 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.359423 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 22:16:46.360150 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.360125 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 22:16:46.360284 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.360131 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 22:16:46.360284 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.360277 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 22:16:46.360503 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.360489 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 22:16:46.360555 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.360535 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 22:16:46.361483 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.361464 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 22:16:46.361750 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.361731 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-5ackffefia5o6\""
Apr 16 22:16:46.361750 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.361748 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 22:16:46.361911 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.361733 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 22:16:46.367778 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.367741 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 22:16:46.370964 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.370947 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 22:16:46.380084 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.380061 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-qk9xz\""
Apr 16 22:16:46.387919 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.387896 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 22:16:46.388761 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.388739 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 22:16:46.392874 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.392853 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtfbz\" (UniqueName: \"kubernetes.io/projected/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-kube-api-access-dtfbz\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.392967 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.392888 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.392967 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.392952 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.393046 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.392986 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.393046 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.393009 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-config-out\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.393046 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.393030 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.393158 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.393063 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.393158 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.393082 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.393158 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.393096 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.393158 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.393110 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.393286 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.393201 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-web-config\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.393286 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.393245 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.393286 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.393274 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.393382 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.393314 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.393382 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.393349 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.393382 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.393374 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-config\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.393469 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.393396 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.393469 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.393427 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.494070 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.494033 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-web-config\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.494070 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.494073 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.494345 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.494096 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.494345 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.494122 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.494345 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.494151 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.494345 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.494177 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-config\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.494345 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.494199 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.494345 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.494222 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.494345 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.494260 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dtfbz\" (UniqueName: \"kubernetes.io/projected/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-kube-api-access-dtfbz\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.494345 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.494293 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.494345 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.494345 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.494784 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.494666 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.494784 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.494721 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-config-out\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.494784 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.494749 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.494938 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.494812 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.494938 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.494845 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:46.494938 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.494870 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName:
\"kubernetes.io/empty-dir/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:46.494938 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.494894 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:46.498864 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.495381 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:46.498864 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.495755 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:46.498864 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.495905 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:46.498864 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.497729 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:46.498864 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.498371 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:46.498864 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.498438 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:46.498864 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.498821 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:46.498864 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.498832 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:46.499683 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.499659 
2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:46.499760 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.499727 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:46.500236 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.500207 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-web-config\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:46.500293 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.500266 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:46.500350 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.500295 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:46.500350 
ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.500270 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:46.500726 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.500708 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-config-out\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:46.501161 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.501142 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-config\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:46.502193 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.502177 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:46.507050 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.507012 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtfbz\" (UniqueName: \"kubernetes.io/projected/a90b0a1e-899c-4489-aa75-6a0cb3de3d79-kube-api-access-dtfbz\") pod \"prometheus-k8s-0\" (UID: \"a90b0a1e-899c-4489-aa75-6a0cb3de3d79\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:46.658302 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.658265 
2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:46.813057 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:16:46.813026 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda90b0a1e_899c_4489_aa75_6a0cb3de3d79.slice/crio-310e5c5be4f145a7ce239dd1a178f0dcd65a93d2349aad8c5050f6ee3068e8a0 WatchSource:0}: Error finding container 310e5c5be4f145a7ce239dd1a178f0dcd65a93d2349aad8c5050f6ee3068e8a0: Status 404 returned error can't find the container with id 310e5c5be4f145a7ce239dd1a178f0dcd65a93d2349aad8c5050f6ee3068e8a0 Apr 16 22:16:46.818965 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.816524 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:16:46.947111 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.947012 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a90b0a1e-899c-4489-aa75-6a0cb3de3d79","Type":"ContainerStarted","Data":"310e5c5be4f145a7ce239dd1a178f0dcd65a93d2349aad8c5050f6ee3068e8a0"} Apr 16 22:16:46.948351 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.948324 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b" event={"ID":"6728eef1-a5dc-4915-9aaf-67a89820cad1","Type":"ContainerStarted","Data":"e13690977b7f480e16a8203853b53fc9cf255284fcf2ef061a495e868768d7d8"} Apr 16 22:16:46.978560 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:46.978506 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b" podStartSLOduration=1.822531547 podStartE2EDuration="2.978489246s" podCreationTimestamp="2026-04-16 22:16:44 +0000 UTC" firstStartedPulling="2026-04-16 22:16:45.104825463 +0000 UTC m=+181.401533899" lastFinishedPulling="2026-04-16 22:16:46.260783148 
+0000 UTC m=+182.557491598" observedRunningTime="2026-04-16 22:16:46.976217192 +0000 UTC m=+183.272925649" watchObservedRunningTime="2026-04-16 22:16:46.978489246 +0000 UTC m=+183.275197705" Apr 16 22:16:47.890197 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:47.890167 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6b5ff79494-smjn6" Apr 16 22:16:47.952854 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:47.952815 2571 generic.go:358] "Generic (PLEG): container finished" podID="a90b0a1e-899c-4489-aa75-6a0cb3de3d79" containerID="2669b93b879068cd4fa35bff2873754a70ecffb6446f751a773e78eb64fe57d1" exitCode=0 Apr 16 22:16:47.953257 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:47.952906 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a90b0a1e-899c-4489-aa75-6a0cb3de3d79","Type":"ContainerDied","Data":"2669b93b879068cd4fa35bff2873754a70ecffb6446f751a773e78eb64fe57d1"} Apr 16 22:16:50.967834 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:50.967798 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a90b0a1e-899c-4489-aa75-6a0cb3de3d79","Type":"ContainerStarted","Data":"c322c5f065dbfa68d9dd0e284c25a5a1e9f9ffb8dbca571830422d9f793a60ee"} Apr 16 22:16:50.967834 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:50.967841 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a90b0a1e-899c-4489-aa75-6a0cb3de3d79","Type":"ContainerStarted","Data":"479f31e824254197aeaa3326d7c235304994d5e6d572ba1cec951e49559ec1b1"} Apr 16 22:16:52.976116 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:52.976075 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"a90b0a1e-899c-4489-aa75-6a0cb3de3d79","Type":"ContainerStarted","Data":"8fdff7abc2de7b56dda27e4b95b993c0dac7fc271d7fc412e5f60a509bbf6c0c"} Apr 16 22:16:52.976116 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:52.976114 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a90b0a1e-899c-4489-aa75-6a0cb3de3d79","Type":"ContainerStarted","Data":"208d17a75f66b98bccad417566c7f8b0890b0a72e19cece36e2f53d0f7feeb64"} Apr 16 22:16:52.976116 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:52.976123 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a90b0a1e-899c-4489-aa75-6a0cb3de3d79","Type":"ContainerStarted","Data":"a18ccf759dae38780815bf4b0e0bd40ec77d649fa0586c27dc04e6ea25443509"} Apr 16 22:16:52.976698 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:52.976131 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a90b0a1e-899c-4489-aa75-6a0cb3de3d79","Type":"ContainerStarted","Data":"970669741ad32be29c4e7ff5a078fe6fa951e3922539853d4fdc86ac9178c131"} Apr 16 22:16:53.011280 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:53.011224 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.842385114 podStartE2EDuration="7.011211903s" podCreationTimestamp="2026-04-16 22:16:46 +0000 UTC" firstStartedPulling="2026-04-16 22:16:46.820227581 +0000 UTC m=+183.116936031" lastFinishedPulling="2026-04-16 22:16:51.989054385 +0000 UTC m=+188.285762820" observedRunningTime="2026-04-16 22:16:53.010032973 +0000 UTC m=+189.306741430" watchObservedRunningTime="2026-04-16 22:16:53.011211903 +0000 UTC m=+189.307920361" Apr 16 22:16:56.658978 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:16:56.658934 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
22:17:04.939743 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:17:04.939681 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b" Apr 16 22:17:04.939743 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:17:04.939747 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b" Apr 16 22:17:24.945191 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:17:24.945158 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b" Apr 16 22:17:24.949215 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:17:24.949188 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5b46d89c44-vdw5b" Apr 16 22:17:46.659260 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:17:46.659220 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:46.679471 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:17:46.679439 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:47.143964 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:17:47.143930 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:55.163347 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:17:55.163303 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs\") pod \"network-metrics-daemon-89m59\" (UID: \"3eb950f1-309c-40b0-a9b2-13d46c7e2d60\") " pod="openshift-multus/network-metrics-daemon-89m59" Apr 16 22:17:55.165629 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:17:55.165607 2571 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3eb950f1-309c-40b0-a9b2-13d46c7e2d60-metrics-certs\") pod \"network-metrics-daemon-89m59\" (UID: \"3eb950f1-309c-40b0-a9b2-13d46c7e2d60\") " pod="openshift-multus/network-metrics-daemon-89m59" Apr 16 22:17:55.171172 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:17:55.171147 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gj5qm\"" Apr 16 22:17:55.179290 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:17:55.179271 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-89m59" Apr 16 22:17:55.296611 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:17:55.296575 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-89m59"] Apr 16 22:17:55.299452 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:17:55.299422 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3eb950f1_309c_40b0_a9b2_13d46c7e2d60.slice/crio-3434fb73bef92917379120be7c71efb38f10a80fb4ddbe916d09dc3fa04fd819 WatchSource:0}: Error finding container 3434fb73bef92917379120be7c71efb38f10a80fb4ddbe916d09dc3fa04fd819: Status 404 returned error can't find the container with id 3434fb73bef92917379120be7c71efb38f10a80fb4ddbe916d09dc3fa04fd819 Apr 16 22:17:56.152489 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:17:56.152446 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-89m59" event={"ID":"3eb950f1-309c-40b0-a9b2-13d46c7e2d60","Type":"ContainerStarted","Data":"3434fb73bef92917379120be7c71efb38f10a80fb4ddbe916d09dc3fa04fd819"} Apr 16 22:17:57.156917 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:17:57.156882 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-89m59" 
event={"ID":"3eb950f1-309c-40b0-a9b2-13d46c7e2d60","Type":"ContainerStarted","Data":"805e019d6c91c2899b97b7df830f95ccefc621490b2a85f5625f410e1f2f8559"} Apr 16 22:17:57.156917 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:17:57.156921 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-89m59" event={"ID":"3eb950f1-309c-40b0-a9b2-13d46c7e2d60","Type":"ContainerStarted","Data":"a7262bf37a988c987bc92bde74f8e2cce15ad397142847ea7d9494a4339761f7"} Apr 16 22:17:57.174598 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:17:57.174544 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-89m59" podStartSLOduration=252.282309807 podStartE2EDuration="4m13.17452741s" podCreationTimestamp="2026-04-16 22:13:44 +0000 UTC" firstStartedPulling="2026-04-16 22:17:55.301264742 +0000 UTC m=+251.597973179" lastFinishedPulling="2026-04-16 22:17:56.193482343 +0000 UTC m=+252.490190782" observedRunningTime="2026-04-16 22:17:57.172356102 +0000 UTC m=+253.469064561" watchObservedRunningTime="2026-04-16 22:17:57.17452741 +0000 UTC m=+253.471235867" Apr 16 22:18:39.101931 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:18:39.101888 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-8zlsk"] Apr 16 22:18:39.105233 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:18:39.105206 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8zlsk" Apr 16 22:18:39.107912 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:18:39.107876 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 22:18:39.113660 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:18:39.113632 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8zlsk"] Apr 16 22:18:39.212080 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:18:39.212036 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/88581cf9-8412-4d1c-a15e-e1f10d312dbe-dbus\") pod \"global-pull-secret-syncer-8zlsk\" (UID: \"88581cf9-8412-4d1c-a15e-e1f10d312dbe\") " pod="kube-system/global-pull-secret-syncer-8zlsk" Apr 16 22:18:39.212263 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:18:39.212101 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/88581cf9-8412-4d1c-a15e-e1f10d312dbe-kubelet-config\") pod \"global-pull-secret-syncer-8zlsk\" (UID: \"88581cf9-8412-4d1c-a15e-e1f10d312dbe\") " pod="kube-system/global-pull-secret-syncer-8zlsk" Apr 16 22:18:39.212263 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:18:39.212137 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/88581cf9-8412-4d1c-a15e-e1f10d312dbe-original-pull-secret\") pod \"global-pull-secret-syncer-8zlsk\" (UID: \"88581cf9-8412-4d1c-a15e-e1f10d312dbe\") " pod="kube-system/global-pull-secret-syncer-8zlsk" Apr 16 22:18:39.313251 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:18:39.313216 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/88581cf9-8412-4d1c-a15e-e1f10d312dbe-dbus\") pod \"global-pull-secret-syncer-8zlsk\" (UID: \"88581cf9-8412-4d1c-a15e-e1f10d312dbe\") " pod="kube-system/global-pull-secret-syncer-8zlsk" Apr 16 22:18:39.313435 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:18:39.313273 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/88581cf9-8412-4d1c-a15e-e1f10d312dbe-kubelet-config\") pod \"global-pull-secret-syncer-8zlsk\" (UID: \"88581cf9-8412-4d1c-a15e-e1f10d312dbe\") " pod="kube-system/global-pull-secret-syncer-8zlsk" Apr 16 22:18:39.313435 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:18:39.313317 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/88581cf9-8412-4d1c-a15e-e1f10d312dbe-original-pull-secret\") pod \"global-pull-secret-syncer-8zlsk\" (UID: \"88581cf9-8412-4d1c-a15e-e1f10d312dbe\") " pod="kube-system/global-pull-secret-syncer-8zlsk" Apr 16 22:18:39.313435 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:18:39.313413 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/88581cf9-8412-4d1c-a15e-e1f10d312dbe-kubelet-config\") pod \"global-pull-secret-syncer-8zlsk\" (UID: \"88581cf9-8412-4d1c-a15e-e1f10d312dbe\") " pod="kube-system/global-pull-secret-syncer-8zlsk" Apr 16 22:18:39.313596 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:18:39.313413 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/88581cf9-8412-4d1c-a15e-e1f10d312dbe-dbus\") pod \"global-pull-secret-syncer-8zlsk\" (UID: \"88581cf9-8412-4d1c-a15e-e1f10d312dbe\") " pod="kube-system/global-pull-secret-syncer-8zlsk" Apr 16 22:18:39.315695 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:18:39.315672 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/88581cf9-8412-4d1c-a15e-e1f10d312dbe-original-pull-secret\") pod \"global-pull-secret-syncer-8zlsk\" (UID: \"88581cf9-8412-4d1c-a15e-e1f10d312dbe\") " pod="kube-system/global-pull-secret-syncer-8zlsk" Apr 16 22:18:39.414812 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:18:39.414684 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8zlsk" Apr 16 22:18:39.530931 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:18:39.530899 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8zlsk"] Apr 16 22:18:39.534526 ip-10-0-130-16 kubenswrapper[2571]: W0416 22:18:39.534480 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88581cf9_8412_4d1c_a15e_e1f10d312dbe.slice/crio-d22322d0ea7d14da70fc50b9e887ff422487152f6c84cae4e49006d1c8d91f38 WatchSource:0}: Error finding container d22322d0ea7d14da70fc50b9e887ff422487152f6c84cae4e49006d1c8d91f38: Status 404 returned error can't find the container with id d22322d0ea7d14da70fc50b9e887ff422487152f6c84cae4e49006d1c8d91f38 Apr 16 22:18:40.274678 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:18:40.274636 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8zlsk" event={"ID":"88581cf9-8412-4d1c-a15e-e1f10d312dbe","Type":"ContainerStarted","Data":"d22322d0ea7d14da70fc50b9e887ff422487152f6c84cae4e49006d1c8d91f38"} Apr 16 22:18:44.264943 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:18:44.264914 2571 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 22:18:44.287742 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:18:44.287703 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8zlsk" 
event={"ID":"88581cf9-8412-4d1c-a15e-e1f10d312dbe","Type":"ContainerStarted","Data":"4b2adea4a5ba7762291ca542014f5f4335ca72741769c716af083a54f711a2a3"} Apr 16 22:18:44.303601 ip-10-0-130-16 kubenswrapper[2571]: I0416 22:18:44.303546 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-8zlsk" podStartSLOduration=1.330044467 podStartE2EDuration="5.303530671s" podCreationTimestamp="2026-04-16 22:18:39 +0000 UTC" firstStartedPulling="2026-04-16 22:18:39.536185012 +0000 UTC m=+295.832893464" lastFinishedPulling="2026-04-16 22:18:43.50967123 +0000 UTC m=+299.806379668" observedRunningTime="2026-04-16 22:18:44.302175471 +0000 UTC m=+300.598883930" watchObservedRunningTime="2026-04-16 22:18:44.303530671 +0000 UTC m=+300.600239129" Apr 16 23:19:15.040714 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:15.040678 2571 ???:1] "http: TLS handshake error from 10.0.130.16:45818: EOF" Apr 16 23:19:15.047303 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:15.047276 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-8zlsk_88581cf9-8412-4d1c-a15e-e1f10d312dbe/global-pull-secret-syncer/0.log" Apr 16 23:19:15.255867 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:15.255832 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-r6qrd_d31d911a-3d25-4b6f-ae92-8ec66384bbbf/konnectivity-agent/0.log" Apr 16 23:19:15.275488 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:15.275462 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-16.ec2.internal_7d17133b5a635b239940c82faeb49a51/haproxy/0.log" Apr 16 23:19:19.391194 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:19.391161 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5b46d89c44-vdw5b_6728eef1-a5dc-4915-9aaf-67a89820cad1/metrics-server/0.log" Apr 16 23:19:19.634863 ip-10-0-130-16 
kubenswrapper[2571]: I0416 23:19:19.634831 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tgnzk_cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079/node-exporter/0.log" Apr 16 23:19:19.657513 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:19.657488 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tgnzk_cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079/kube-rbac-proxy/0.log" Apr 16 23:19:19.677173 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:19.677143 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tgnzk_cb7d7efd-7ddb-49c8-9e32-c6f52cd9e079/init-textfile/0.log" Apr 16 23:19:19.841827 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:19.841791 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a90b0a1e-899c-4489-aa75-6a0cb3de3d79/prometheus/0.log" Apr 16 23:19:19.863517 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:19.863492 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a90b0a1e-899c-4489-aa75-6a0cb3de3d79/config-reloader/0.log" Apr 16 23:19:19.887988 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:19.887912 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a90b0a1e-899c-4489-aa75-6a0cb3de3d79/thanos-sidecar/0.log" Apr 16 23:19:19.914216 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:19.914187 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a90b0a1e-899c-4489-aa75-6a0cb3de3d79/kube-rbac-proxy-web/0.log" Apr 16 23:19:19.941223 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:19.941197 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a90b0a1e-899c-4489-aa75-6a0cb3de3d79/kube-rbac-proxy/0.log" Apr 16 23:19:19.963495 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:19.963475 2571 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a90b0a1e-899c-4489-aa75-6a0cb3de3d79/kube-rbac-proxy-thanos/0.log" Apr 16 23:19:19.986830 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:19.986805 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a90b0a1e-899c-4489-aa75-6a0cb3de3d79/init-config-reloader/0.log" Apr 16 23:19:21.943406 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:21.943372 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8ccpf/perf-node-gather-daemonset-ht69w"] Apr 16 23:19:21.946132 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:21.946112 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-ht69w" Apr 16 23:19:21.949410 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:21.949383 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8ccpf\"/\"kube-root-ca.crt\"" Apr 16 23:19:21.949523 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:21.949416 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-8ccpf\"/\"default-dockercfg-hkh9t\"" Apr 16 23:19:21.949523 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:21.949383 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8ccpf\"/\"openshift-service-ca.crt\"" Apr 16 23:19:21.956476 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:21.956451 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8ccpf/perf-node-gather-daemonset-ht69w"] Apr 16 23:19:22.005231 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:22.005188 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/69e12194-b2d9-4203-875e-be81da80bb0f-podres\") pod 
\"perf-node-gather-daemonset-ht69w\" (UID: \"69e12194-b2d9-4203-875e-be81da80bb0f\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-ht69w" Apr 16 23:19:22.005231 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:22.005233 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/69e12194-b2d9-4203-875e-be81da80bb0f-lib-modules\") pod \"perf-node-gather-daemonset-ht69w\" (UID: \"69e12194-b2d9-4203-875e-be81da80bb0f\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-ht69w" Apr 16 23:19:22.005447 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:22.005253 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/69e12194-b2d9-4203-875e-be81da80bb0f-sys\") pod \"perf-node-gather-daemonset-ht69w\" (UID: \"69e12194-b2d9-4203-875e-be81da80bb0f\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-ht69w" Apr 16 23:19:22.005447 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:22.005296 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtj6m\" (UniqueName: \"kubernetes.io/projected/69e12194-b2d9-4203-875e-be81da80bb0f-kube-api-access-mtj6m\") pod \"perf-node-gather-daemonset-ht69w\" (UID: \"69e12194-b2d9-4203-875e-be81da80bb0f\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-ht69w" Apr 16 23:19:22.005447 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:22.005395 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/69e12194-b2d9-4203-875e-be81da80bb0f-proc\") pod \"perf-node-gather-daemonset-ht69w\" (UID: \"69e12194-b2d9-4203-875e-be81da80bb0f\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-ht69w" Apr 16 23:19:22.106177 ip-10-0-130-16 kubenswrapper[2571]: I0416 
23:19:22.106138 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/69e12194-b2d9-4203-875e-be81da80bb0f-proc\") pod \"perf-node-gather-daemonset-ht69w\" (UID: \"69e12194-b2d9-4203-875e-be81da80bb0f\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-ht69w" Apr 16 23:19:22.106361 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:22.106202 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/69e12194-b2d9-4203-875e-be81da80bb0f-podres\") pod \"perf-node-gather-daemonset-ht69w\" (UID: \"69e12194-b2d9-4203-875e-be81da80bb0f\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-ht69w" Apr 16 23:19:22.106361 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:22.106222 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/69e12194-b2d9-4203-875e-be81da80bb0f-lib-modules\") pod \"perf-node-gather-daemonset-ht69w\" (UID: \"69e12194-b2d9-4203-875e-be81da80bb0f\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-ht69w" Apr 16 23:19:22.106361 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:22.106248 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/69e12194-b2d9-4203-875e-be81da80bb0f-sys\") pod \"perf-node-gather-daemonset-ht69w\" (UID: \"69e12194-b2d9-4203-875e-be81da80bb0f\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-ht69w" Apr 16 23:19:22.106361 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:22.106270 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/69e12194-b2d9-4203-875e-be81da80bb0f-proc\") pod \"perf-node-gather-daemonset-ht69w\" (UID: \"69e12194-b2d9-4203-875e-be81da80bb0f\") " 
pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-ht69w" Apr 16 23:19:22.106361 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:22.106279 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtj6m\" (UniqueName: \"kubernetes.io/projected/69e12194-b2d9-4203-875e-be81da80bb0f-kube-api-access-mtj6m\") pod \"perf-node-gather-daemonset-ht69w\" (UID: \"69e12194-b2d9-4203-875e-be81da80bb0f\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-ht69w" Apr 16 23:19:22.106361 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:22.106327 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/69e12194-b2d9-4203-875e-be81da80bb0f-podres\") pod \"perf-node-gather-daemonset-ht69w\" (UID: \"69e12194-b2d9-4203-875e-be81da80bb0f\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-ht69w" Apr 16 23:19:22.106361 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:22.106343 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/69e12194-b2d9-4203-875e-be81da80bb0f-lib-modules\") pod \"perf-node-gather-daemonset-ht69w\" (UID: \"69e12194-b2d9-4203-875e-be81da80bb0f\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-ht69w" Apr 16 23:19:22.106599 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:22.106362 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/69e12194-b2d9-4203-875e-be81da80bb0f-sys\") pod \"perf-node-gather-daemonset-ht69w\" (UID: \"69e12194-b2d9-4203-875e-be81da80bb0f\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-ht69w" Apr 16 23:19:22.114439 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:22.114401 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtj6m\" (UniqueName: 
\"kubernetes.io/projected/69e12194-b2d9-4203-875e-be81da80bb0f-kube-api-access-mtj6m\") pod \"perf-node-gather-daemonset-ht69w\" (UID: \"69e12194-b2d9-4203-875e-be81da80bb0f\") " pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-ht69w" Apr 16 23:19:22.256914 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:22.256813 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-ht69w" Apr 16 23:19:22.377866 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:22.377836 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8ccpf/perf-node-gather-daemonset-ht69w"] Apr 16 23:19:22.380921 ip-10-0-130-16 kubenswrapper[2571]: W0416 23:19:22.380877 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod69e12194_b2d9_4203_875e_be81da80bb0f.slice/crio-da39b6f28cf7cde2e93bd806f0310e0f615bb2695ea5e57a2989cd75b7621ce4 WatchSource:0}: Error finding container da39b6f28cf7cde2e93bd806f0310e0f615bb2695ea5e57a2989cd75b7621ce4: Status 404 returned error can't find the container with id da39b6f28cf7cde2e93bd806f0310e0f615bb2695ea5e57a2989cd75b7621ce4 Apr 16 23:19:22.382565 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:22.382549 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 23:19:23.025000 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:23.024962 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-ht69w" event={"ID":"69e12194-b2d9-4203-875e-be81da80bb0f","Type":"ContainerStarted","Data":"256016fa493b754fa0a5eddd98897a4d34831f5b03dc68c498637b3edc8f1275"} Apr 16 23:19:23.025000 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:23.025002 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-ht69w" 
event={"ID":"69e12194-b2d9-4203-875e-be81da80bb0f","Type":"ContainerStarted","Data":"da39b6f28cf7cde2e93bd806f0310e0f615bb2695ea5e57a2989cd75b7621ce4"} Apr 16 23:19:23.025442 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:23.025109 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-ht69w" Apr 16 23:19:23.028043 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:23.028019 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7tmv5_e30781c3-7044-4751-961b-48bf5ce84af8/dns/0.log" Apr 16 23:19:23.040630 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:23.040588 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-ht69w" podStartSLOduration=2.04057266 podStartE2EDuration="2.04057266s" podCreationTimestamp="2026-04-16 23:19:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:19:23.039593739 +0000 UTC m=+3939.336302196" watchObservedRunningTime="2026-04-16 23:19:23.04057266 +0000 UTC m=+3939.337281163" Apr 16 23:19:23.053490 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:23.053468 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7tmv5_e30781c3-7044-4751-961b-48bf5ce84af8/kube-rbac-proxy/0.log" Apr 16 23:19:23.212782 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:23.212748 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-nwdxm_43ef0782-3c86-49cc-a384-b383bf10c750/dns-node-resolver/0.log" Apr 16 23:19:23.590447 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:23.590413 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6b5ff79494-smjn6_2392c624-be45-4127-b416-d45610367c07/registry/0.log" Apr 16 23:19:23.633306 ip-10-0-130-16 
kubenswrapper[2571]: I0416 23:19:23.633280 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kjs6v_b7a91f2b-d56f-4cd1-86e6-533c19173cab/node-ca/0.log" Apr 16 23:19:24.674746 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:24.674718 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-m4s6c_e1c9f736-046e-47ad-a262-894ae7c4f9b6/serve-healthcheck-canary/0.log" Apr 16 23:19:25.096993 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:25.096891 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-t9crn_0830df32-674c-4f25-b525-e9b82449155d/kube-rbac-proxy/0.log" Apr 16 23:19:25.115925 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:25.115897 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-t9crn_0830df32-674c-4f25-b525-e9b82449155d/exporter/0.log" Apr 16 23:19:25.135477 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:25.135447 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-t9crn_0830df32-674c-4f25-b525-e9b82449155d/extractor/0.log" Apr 16 23:19:29.037080 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:29.037049 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-8ccpf/perf-node-gather-daemonset-ht69w" Apr 16 23:19:32.627277 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:32.627244 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jdp2m_41a45a8e-91e8-4a52-90bd-9fddbbac91d0/kube-multus-additional-cni-plugins/0.log" Apr 16 23:19:32.646311 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:32.646287 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jdp2m_41a45a8e-91e8-4a52-90bd-9fddbbac91d0/egress-router-binary-copy/0.log" Apr 16 
23:19:32.666559 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:32.666483 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jdp2m_41a45a8e-91e8-4a52-90bd-9fddbbac91d0/cni-plugins/0.log" Apr 16 23:19:32.687053 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:32.687027 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jdp2m_41a45a8e-91e8-4a52-90bd-9fddbbac91d0/bond-cni-plugin/0.log" Apr 16 23:19:32.706819 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:32.706789 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jdp2m_41a45a8e-91e8-4a52-90bd-9fddbbac91d0/routeoverride-cni/0.log" Apr 16 23:19:32.728221 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:32.728193 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jdp2m_41a45a8e-91e8-4a52-90bd-9fddbbac91d0/whereabouts-cni-bincopy/0.log" Apr 16 23:19:32.750326 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:32.750301 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jdp2m_41a45a8e-91e8-4a52-90bd-9fddbbac91d0/whereabouts-cni/0.log" Apr 16 23:19:33.119381 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:33.119354 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q92dp_efe2768a-1ae2-4e05-9621-e7a2cb669a2f/kube-multus/0.log" Apr 16 23:19:33.229603 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:33.229574 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-89m59_3eb950f1-309c-40b0-a9b2-13d46c7e2d60/network-metrics-daemon/0.log" Apr 16 23:19:33.252088 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:33.252010 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_network-metrics-daemon-89m59_3eb950f1-309c-40b0-a9b2-13d46c7e2d60/kube-rbac-proxy/0.log" Apr 16 23:19:33.989337 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:33.989312 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-88hfq_a0f30e25-01b8-4134-8184-2a06ef526c62/ovn-controller/0.log" Apr 16 23:19:34.023580 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:34.023545 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-88hfq_a0f30e25-01b8-4134-8184-2a06ef526c62/ovn-acl-logging/0.log" Apr 16 23:19:34.042715 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:34.042669 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-88hfq_a0f30e25-01b8-4134-8184-2a06ef526c62/kube-rbac-proxy-node/0.log" Apr 16 23:19:34.062445 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:34.062418 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-88hfq_a0f30e25-01b8-4134-8184-2a06ef526c62/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 23:19:34.078994 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:34.078961 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-88hfq_a0f30e25-01b8-4134-8184-2a06ef526c62/northd/0.log" Apr 16 23:19:34.098371 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:34.098344 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-88hfq_a0f30e25-01b8-4134-8184-2a06ef526c62/nbdb/0.log" Apr 16 23:19:34.117451 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:34.117425 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-88hfq_a0f30e25-01b8-4134-8184-2a06ef526c62/sbdb/0.log" Apr 16 23:19:34.206192 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:34.206150 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-88hfq_a0f30e25-01b8-4134-8184-2a06ef526c62/ovnkube-controller/0.log" Apr 16 23:19:35.738278 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:35.738237 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-gzcjz_8e076f04-d3ed-41ce-ade7-ecc7c991025a/network-check-target-container/0.log" Apr 16 23:19:36.555532 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:36.555500 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-4vrs4_f4c633be-7a8f-41f5-ba11-b35591028a41/iptables-alerter/0.log" Apr 16 23:19:37.144841 ip-10-0-130-16 kubenswrapper[2571]: I0416 23:19:37.144813 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-6h8wj_47517e74-f8d0-404e-8053-08ab938c2d36/tuned/0.log"