Apr 17 09:08:24.216350 ip-10-0-143-18 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 09:08:24.216360 ip-10-0-143-18 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 09:08:24.216367 ip-10-0-143-18 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 09:08:24.216612 ip-10-0-143-18 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 09:08:34.427349 ip-10-0-143-18 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 09:08:34.427367 ip-10-0-143-18 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 020382c896df4947b4667b36496cb27e --
Apr 17 09:10:59.160831 ip-10-0-143-18 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 09:10:59.626916 ip-10-0-143-18 kubenswrapper[2606]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 09:10:59.626916 ip-10-0-143-18 kubenswrapper[2606]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 09:10:59.626916 ip-10-0-143-18 kubenswrapper[2606]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 09:10:59.626916 ip-10-0-143-18 kubenswrapper[2606]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 09:10:59.626916 ip-10-0-143-18 kubenswrapper[2606]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 09:10:59.628129 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.627353 2606 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 09:10:59.632769 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632744 2606 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 09:10:59.632769 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632765 2606 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 09:10:59.632769 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632771 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 09:10:59.632769 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632774 2606 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 09:10:59.632769 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632778 2606 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 09:10:59.632994 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632781 2606 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 09:10:59.632994 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632785 2606 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 09:10:59.632994 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632788 2606 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 09:10:59.632994 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632791 2606 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 09:10:59.632994 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632794 2606 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 09:10:59.632994 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632797 2606 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 09:10:59.632994 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632800 2606 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 09:10:59.632994 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632803 2606 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 09:10:59.632994 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632805 2606 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 09:10:59.632994 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632808 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 09:10:59.632994 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632811 2606 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 09:10:59.632994 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632815 2606 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 09:10:59.632994 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632817 2606 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 09:10:59.632994 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632820 2606 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 09:10:59.632994 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632824 2606 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 09:10:59.632994 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632826 2606 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 09:10:59.632994 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632829 2606 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 09:10:59.632994 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632832 2606 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 09:10:59.632994 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632836 2606 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 09:10:59.632994 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632839 2606 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 09:10:59.633469 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632842 2606 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 09:10:59.633469 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632850 2606 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 09:10:59.633469 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632853 2606 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 09:10:59.633469 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632856 2606 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 09:10:59.633469 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632859 2606 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 09:10:59.633469 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632861 2606 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 09:10:59.633469 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632864 2606 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 09:10:59.633469 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632866 2606 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 09:10:59.633469 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632869 2606 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 09:10:59.633469 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632872 2606 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 09:10:59.633469 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632875 2606 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 09:10:59.633469 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632877 2606 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 09:10:59.633469 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632880 2606 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 09:10:59.633469 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632884 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 09:10:59.633469 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632887 2606 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 09:10:59.633469 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632890 2606 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 09:10:59.633469 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632893 2606 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 09:10:59.633469 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632895 2606 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 09:10:59.633469 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632898 2606 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 09:10:59.633469 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632900 2606 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 09:10:59.633959 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632903 2606 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 09:10:59.633959 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632906 2606 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 09:10:59.633959 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632908 2606 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 09:10:59.633959 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632911 2606 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 09:10:59.633959 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632913 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 09:10:59.633959 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632918 2606 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 09:10:59.633959 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632921 2606 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 09:10:59.633959 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632924 2606 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 09:10:59.633959 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632926 2606 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 09:10:59.633959 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632929 2606 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 09:10:59.633959 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632932 2606 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 09:10:59.633959 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632934 2606 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 09:10:59.633959 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632937 2606 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 09:10:59.633959 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632940 2606 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 09:10:59.633959 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632942 2606 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 09:10:59.633959 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632945 2606 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 09:10:59.633959 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632947 2606 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 09:10:59.633959 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632950 2606 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 09:10:59.633959 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632952 2606 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 09:10:59.634415 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632955 2606 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 09:10:59.634415 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632957 2606 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 09:10:59.634415 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632960 2606 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 09:10:59.634415 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632962 2606 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 09:10:59.634415 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632965 2606 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 09:10:59.634415 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632967 2606 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 09:10:59.634415 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632970 2606 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 09:10:59.634415 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632974 2606 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 09:10:59.634415 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632976 2606 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 09:10:59.634415 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632979 2606 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 09:10:59.634415 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632982 2606 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 09:10:59.634415 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632985 2606 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 09:10:59.634415 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632987 2606 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 09:10:59.634415 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632990 2606 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 09:10:59.634415 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632993 2606 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 09:10:59.634415 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632996 2606 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 09:10:59.634415 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.632999 2606 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 09:10:59.634415 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633002 2606 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 09:10:59.634415 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633005 2606 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 09:10:59.634879 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633008 2606 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 09:10:59.634879 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633010 2606 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 09:10:59.634879 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633013 2606 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 09:10:59.634879 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633483 2606 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 09:10:59.634879 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633503 2606 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 09:10:59.634879 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633507 2606 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 09:10:59.634879 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633510 2606 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 09:10:59.634879 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633513 2606 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 09:10:59.634879 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633516 2606 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 09:10:59.634879 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633519 2606 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 09:10:59.634879 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633522 2606 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 09:10:59.634879 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633525 2606 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 09:10:59.634879 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633528 2606 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 09:10:59.634879 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633531 2606 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 09:10:59.634879 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633533 2606 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 09:10:59.634879 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633536 2606 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 09:10:59.634879 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633539 2606 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 09:10:59.634879 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633542 2606 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 09:10:59.634879 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633544 2606 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 09:10:59.634879 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633548 2606 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 09:10:59.635341 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633551 2606 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 09:10:59.635341 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633553 2606 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 09:10:59.635341 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633556 2606 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 09:10:59.635341 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633558 2606 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 09:10:59.635341 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633561 2606 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 09:10:59.635341 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633564 2606 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 09:10:59.635341 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633566 2606 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 09:10:59.635341 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633569 2606 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 09:10:59.635341 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633572 2606 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 09:10:59.635341 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633575 2606 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 09:10:59.635341 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633578 2606 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 09:10:59.635341 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633581 2606 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 09:10:59.635341 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633583 2606 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 09:10:59.635341 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633586 2606 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 09:10:59.635341 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633588 2606 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 09:10:59.635341 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633591 2606 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 09:10:59.635341 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633593 2606 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 09:10:59.635341 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633596 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 09:10:59.635341 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633598 2606 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 09:10:59.635826 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633601 2606 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 09:10:59.635826 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633603 2606 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 09:10:59.635826 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633606 2606 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 09:10:59.635826 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633608 2606 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 09:10:59.635826 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633611 2606 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 09:10:59.635826 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633613 2606 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 09:10:59.635826 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633616 2606 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 09:10:59.635826 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633618 2606 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 09:10:59.635826 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633621 2606 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 09:10:59.635826 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633623 2606 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 09:10:59.635826 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633626 2606 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 09:10:59.635826 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633628 2606 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 09:10:59.635826 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633632 2606 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 09:10:59.635826 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633635 2606 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 09:10:59.635826 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633638 2606 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 09:10:59.635826 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633640 2606 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 09:10:59.635826 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633643 2606 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 09:10:59.635826 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633646 2606 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 09:10:59.635826 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633648 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 09:10:59.635826 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633651 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 09:10:59.636319 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633653 2606 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 09:10:59.636319 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633656 2606 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 09:10:59.636319 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633659 2606 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 09:10:59.636319 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633661 2606 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 09:10:59.636319 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633665 2606 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 09:10:59.636319 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633669 2606 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 09:10:59.636319 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633671 2606 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 09:10:59.636319 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633674 2606 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 09:10:59.636319 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633677 2606 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 09:10:59.636319 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633679 2606 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 09:10:59.636319 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633682 2606 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 09:10:59.636319 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633685 2606 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 09:10:59.636319 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633688 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 09:10:59.636319 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633691 2606 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 09:10:59.636319 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633693 2606 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 09:10:59.636319 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633697 2606 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 09:10:59.636319 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633701 2606 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 09:10:59.636319 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633704 2606 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 09:10:59.636319 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633707 2606 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 09:10:59.636816 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633710 2606 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 09:10:59.636816 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633713 2606 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 09:10:59.636816 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633716 2606 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 09:10:59.636816 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633718 2606 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 09:10:59.636816 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633721 2606 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 09:10:59.636816 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633724 2606 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 09:10:59.636816 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633727 2606 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 09:10:59.636816 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633730 2606 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 09:10:59.636816 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633732 2606 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 09:10:59.636816 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633735 2606 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 09:10:59.636816 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.633737 2606 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 09:10:59.636816 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633812 2606 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 09:10:59.636816 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633820 2606 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 09:10:59.636816 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633827 2606 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 09:10:59.636816 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633831 2606 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 09:10:59.636816 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633837 2606 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 09:10:59.636816 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633840 2606 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 09:10:59.636816 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633845 2606 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 09:10:59.636816 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633850 2606 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 09:10:59.636816 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633853 2606 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 09:10:59.636816 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633856 2606 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 09:10:59.637334 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633860 2606 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 09:10:59.637334 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633864 2606 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 09:10:59.637334 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633867 2606 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 09:10:59.637334 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633871 2606 flags.go:64] FLAG: --cgroup-root=""
Apr 17 09:10:59.637334 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633874 2606 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 09:10:59.637334 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633877 2606 flags.go:64] FLAG: --client-ca-file=""
Apr 17 09:10:59.637334 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633880 2606 flags.go:64] FLAG: --cloud-config=""
Apr 17 09:10:59.637334 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633882 2606 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 09:10:59.637334 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633885 2606 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 09:10:59.637334 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633889 2606 flags.go:64] FLAG: --cluster-domain=""
Apr 17 09:10:59.637334 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633893 2606 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 09:10:59.637334 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633896 2606 flags.go:64] FLAG: --config-dir=""
Apr 17 09:10:59.637334 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633899 2606 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 09:10:59.637334 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633903 2606 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 09:10:59.637334 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633907 2606 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 09:10:59.637334 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633910 2606 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 09:10:59.637334 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633913 2606 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 09:10:59.637334 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633916 2606 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 09:10:59.637334 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633919 2606 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 09:10:59.637334 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633923 2606 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 09:10:59.637334 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633926 2606 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 09:10:59.637334 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633930 2606 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 09:10:59.637334 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633933 2606 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 09:10:59.637334 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633937 2606 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 09:10:59.637334 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633940 2606 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 09:10:59.637968 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633943 2606 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 09:10:59.637968 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633946 2606 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 09:10:59.637968 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633950 2606 flags.go:64] FLAG: --enable-server="true"
Apr 17 09:10:59.637968 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633953 2606 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 09:10:59.637968 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633958 2606 flags.go:64] FLAG: --event-burst="100"
Apr 17 09:10:59.637968 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633961 2606 flags.go:64] FLAG: --event-qps="50"
Apr 17 09:10:59.637968 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633964 2606 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 09:10:59.637968 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633967 2606 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 09:10:59.637968 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633971 2606 flags.go:64] FLAG: --eviction-hard=""
Apr 17 09:10:59.637968 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633974 2606 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 09:10:59.637968 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633978 2606 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 09:10:59.637968 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633981 2606 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 09:10:59.637968 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633984 2606 flags.go:64] FLAG: --eviction-soft=""
Apr 17 09:10:59.637968 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633987 2606 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 09:10:59.637968 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633990 2606 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 09:10:59.637968 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633994 2606 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 09:10:59.637968 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633996 2606 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 09:10:59.637968 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.633999 2606 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 09:10:59.637968 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634002 2606 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 09:10:59.637968 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634005 2606 flags.go:64] FLAG: --feature-gates=""
Apr 17 09:10:59.637968 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634009 2606 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17
09:10:59.637968 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634012 2606 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 09:10:59.637968 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634016 2606 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 09:10:59.637968 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634019 2606 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 09:10:59.637968 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634022 2606 flags.go:64] FLAG: --healthz-port="10248" Apr 17 09:10:59.637968 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634026 2606 flags.go:64] FLAG: --help="false" Apr 17 09:10:59.638598 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634031 2606 flags.go:64] FLAG: --hostname-override="ip-10-0-143-18.ec2.internal" Apr 17 09:10:59.638598 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634034 2606 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 09:10:59.638598 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634038 2606 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 09:10:59.638598 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634041 2606 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 09:10:59.638598 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634044 2606 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 09:10:59.638598 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634047 2606 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 09:10:59.638598 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634051 2606 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 09:10:59.638598 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634054 2606 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 09:10:59.638598 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634056 2606 flags.go:64] 
FLAG: --kernel-memcg-notification="false" Apr 17 09:10:59.638598 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634060 2606 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 09:10:59.638598 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634063 2606 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 09:10:59.638598 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634066 2606 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 09:10:59.638598 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634069 2606 flags.go:64] FLAG: --kube-reserved="" Apr 17 09:10:59.638598 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634072 2606 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 09:10:59.638598 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634075 2606 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 09:10:59.638598 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634078 2606 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 09:10:59.638598 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634081 2606 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 09:10:59.638598 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634085 2606 flags.go:64] FLAG: --lock-file="" Apr 17 09:10:59.638598 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634088 2606 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 09:10:59.638598 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634091 2606 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 09:10:59.638598 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634094 2606 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 09:10:59.638598 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634099 2606 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 09:10:59.638598 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634102 2606 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 09:10:59.639160 ip-10-0-143-18 kubenswrapper[2606]: I0417 
09:10:59.634105 2606 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 09:10:59.639160 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634108 2606 flags.go:64] FLAG: --logging-format="text" Apr 17 09:10:59.639160 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634111 2606 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 09:10:59.639160 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634114 2606 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 09:10:59.639160 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634117 2606 flags.go:64] FLAG: --manifest-url="" Apr 17 09:10:59.639160 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634120 2606 flags.go:64] FLAG: --manifest-url-header="" Apr 17 09:10:59.639160 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634124 2606 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 09:10:59.639160 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634127 2606 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 09:10:59.639160 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634131 2606 flags.go:64] FLAG: --max-pods="110" Apr 17 09:10:59.639160 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634139 2606 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 09:10:59.639160 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634142 2606 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 09:10:59.639160 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634145 2606 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 09:10:59.639160 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634148 2606 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 09:10:59.639160 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634151 2606 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 09:10:59.639160 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634153 2606 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 
09:10:59.639160 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634156 2606 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 09:10:59.639160 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634164 2606 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 09:10:59.639160 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634167 2606 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 09:10:59.639160 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634171 2606 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 09:10:59.639160 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634174 2606 flags.go:64] FLAG: --pod-cidr="" Apr 17 09:10:59.639160 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634177 2606 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 09:10:59.639160 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634183 2606 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 09:10:59.639160 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634190 2606 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 09:10:59.639160 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634193 2606 flags.go:64] FLAG: --pods-per-core="0" Apr 17 09:10:59.639780 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634196 2606 flags.go:64] FLAG: --port="10250" Apr 17 09:10:59.639780 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634199 2606 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 09:10:59.639780 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634202 2606 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-041c02f74eb64db95" Apr 17 09:10:59.639780 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634205 2606 flags.go:64] FLAG: --qos-reserved="" Apr 17 09:10:59.639780 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634208 2606 flags.go:64] FLAG: 
--read-only-port="10255" Apr 17 09:10:59.639780 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634212 2606 flags.go:64] FLAG: --register-node="true" Apr 17 09:10:59.639780 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634215 2606 flags.go:64] FLAG: --register-schedulable="true" Apr 17 09:10:59.639780 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634218 2606 flags.go:64] FLAG: --register-with-taints="" Apr 17 09:10:59.639780 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634222 2606 flags.go:64] FLAG: --registry-burst="10" Apr 17 09:10:59.639780 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634225 2606 flags.go:64] FLAG: --registry-qps="5" Apr 17 09:10:59.639780 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634227 2606 flags.go:64] FLAG: --reserved-cpus="" Apr 17 09:10:59.639780 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634230 2606 flags.go:64] FLAG: --reserved-memory="" Apr 17 09:10:59.639780 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634234 2606 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 09:10:59.639780 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634237 2606 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 09:10:59.639780 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634240 2606 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 09:10:59.639780 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634243 2606 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 09:10:59.639780 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634246 2606 flags.go:64] FLAG: --runonce="false" Apr 17 09:10:59.639780 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634250 2606 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 09:10:59.639780 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634253 2606 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 09:10:59.639780 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634256 2606 flags.go:64] FLAG: 
--seccomp-default="false" Apr 17 09:10:59.639780 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634259 2606 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 09:10:59.639780 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634262 2606 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 09:10:59.639780 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634265 2606 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 09:10:59.639780 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634268 2606 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 09:10:59.639780 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634271 2606 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 09:10:59.639780 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634274 2606 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 09:10:59.640678 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634276 2606 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 09:10:59.640678 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634279 2606 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 09:10:59.640678 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634283 2606 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 09:10:59.640678 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634286 2606 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 09:10:59.640678 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634290 2606 flags.go:64] FLAG: --system-cgroups="" Apr 17 09:10:59.640678 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634293 2606 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 09:10:59.640678 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634298 2606 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 09:10:59.640678 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634301 2606 flags.go:64] FLAG: --tls-cert-file="" Apr 17 09:10:59.640678 ip-10-0-143-18 
kubenswrapper[2606]: I0417 09:10:59.634304 2606 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 09:10:59.640678 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634309 2606 flags.go:64] FLAG: --tls-min-version="" Apr 17 09:10:59.640678 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634311 2606 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 09:10:59.640678 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634314 2606 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 09:10:59.640678 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634317 2606 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 09:10:59.640678 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634320 2606 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 09:10:59.640678 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634323 2606 flags.go:64] FLAG: --v="2" Apr 17 09:10:59.640678 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634327 2606 flags.go:64] FLAG: --version="false" Apr 17 09:10:59.640678 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634331 2606 flags.go:64] FLAG: --vmodule="" Apr 17 09:10:59.640678 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634336 2606 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 09:10:59.640678 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634339 2606 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 09:10:59.640678 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634453 2606 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 09:10:59.640678 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634457 2606 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 09:10:59.640678 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634460 2606 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 09:10:59.640678 ip-10-0-143-18 kubenswrapper[2606]: W0417 
09:10:59.634464 2606 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 09:10:59.641702 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634469 2606 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 09:10:59.641702 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634473 2606 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 09:10:59.641702 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634476 2606 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 09:10:59.641702 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634479 2606 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 09:10:59.641702 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634482 2606 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 09:10:59.641702 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634485 2606 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 09:10:59.641702 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634501 2606 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 09:10:59.641702 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634505 2606 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 09:10:59.641702 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634508 2606 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 09:10:59.641702 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634511 2606 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 09:10:59.641702 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634514 2606 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 09:10:59.641702 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634517 2606 feature_gate.go:328] unrecognized feature gate: 
DNSNameResolver Apr 17 09:10:59.641702 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634520 2606 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 09:10:59.641702 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634525 2606 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 09:10:59.641702 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634528 2606 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 09:10:59.641702 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634531 2606 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 09:10:59.641702 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634534 2606 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 09:10:59.641702 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634537 2606 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 09:10:59.641702 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634541 2606 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 09:10:59.641702 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634544 2606 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 09:10:59.642591 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634547 2606 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 09:10:59.642591 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634549 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 09:10:59.642591 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634552 2606 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 09:10:59.642591 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634554 2606 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 09:10:59.642591 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634557 2606 
feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 09:10:59.642591 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634560 2606 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 09:10:59.642591 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634562 2606 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 09:10:59.642591 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634565 2606 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 09:10:59.642591 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634567 2606 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 09:10:59.642591 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634570 2606 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 09:10:59.642591 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634573 2606 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 09:10:59.642591 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634575 2606 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 09:10:59.642591 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634580 2606 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 09:10:59.642591 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634583 2606 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 09:10:59.642591 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634585 2606 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 09:10:59.642591 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634588 2606 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 09:10:59.642591 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634591 2606 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 09:10:59.642591 ip-10-0-143-18 kubenswrapper[2606]: W0417 
09:10:59.634593 2606 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 09:10:59.642591 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634596 2606 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 09:10:59.642591 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634599 2606 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 09:10:59.643471 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634601 2606 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 09:10:59.643471 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634604 2606 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 09:10:59.643471 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634607 2606 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 09:10:59.643471 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634609 2606 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 09:10:59.643471 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634612 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 09:10:59.643471 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634617 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 09:10:59.643471 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634619 2606 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 09:10:59.643471 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634622 2606 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 09:10:59.643471 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634624 2606 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 09:10:59.643471 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634627 2606 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 
09:10:59.643471 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634630 2606 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 09:10:59.643471 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634632 2606 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 09:10:59.643471 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634635 2606 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 09:10:59.643471 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634637 2606 feature_gate.go:328] unrecognized feature gate: Example Apr 17 09:10:59.643471 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634640 2606 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 09:10:59.643471 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634643 2606 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 09:10:59.643471 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634645 2606 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 09:10:59.643471 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634648 2606 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 09:10:59.643471 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634650 2606 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 09:10:59.643471 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634653 2606 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 09:10:59.644116 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634656 2606 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 09:10:59.644116 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634658 2606 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 09:10:59.644116 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634661 2606 feature_gate.go:328] unrecognized feature gate: 
NewOLMWebhookProviderOpenshiftServiceCA Apr 17 09:10:59.644116 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634663 2606 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 09:10:59.644116 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634667 2606 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 09:10:59.644116 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634670 2606 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 09:10:59.644116 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634672 2606 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 09:10:59.644116 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634675 2606 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 09:10:59.644116 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634678 2606 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 09:10:59.644116 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634681 2606 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 09:10:59.644116 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634684 2606 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 09:10:59.644116 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634686 2606 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 09:10:59.644116 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634689 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 09:10:59.644116 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634691 2606 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 09:10:59.644116 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634694 2606 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 09:10:59.644116 
ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634697 2606 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 09:10:59.644116 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634700 2606 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 09:10:59.644116 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634703 2606 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 09:10:59.644116 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634708 2606 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 09:10:59.644898 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634711 2606 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 09:10:59.644898 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634714 2606 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 09:10:59.644898 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.634716 2606 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 09:10:59.644898 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.634722 2606 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 09:10:59.644898 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.642302 2606 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 09:10:59.644898 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.642327 2606 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" 
Apr 17 09:10:59.644898 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642404 2606 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 09:10:59.644898 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642413 2606 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 09:10:59.644898 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642418 2606 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 09:10:59.644898 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642424 2606 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 09:10:59.644898 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642429 2606 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 09:10:59.644898 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642433 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 09:10:59.644898 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642438 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 09:10:59.644898 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642445 2606 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 09:10:59.644898 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642453 2606 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 09:10:59.645576 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642459 2606 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 09:10:59.645576 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642464 2606 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 09:10:59.645576 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642469 2606 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 09:10:59.645576 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642473 2606 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 09:10:59.645576 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642478 2606 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 09:10:59.645576 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642486 2606 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 09:10:59.645576 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642508 2606 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 09:10:59.645576 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642513 2606 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 09:10:59.645576 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642517 2606 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 09:10:59.645576 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642522 2606 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 09:10:59.645576 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642527 2606 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 09:10:59.645576 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642532 2606 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 09:10:59.645576 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642537 2606 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 09:10:59.645576 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642541 2606 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 09:10:59.645576 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642545 2606 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 09:10:59.645576 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642550 2606 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 09:10:59.645576 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642555 2606 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 09:10:59.645576 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642560 2606 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 09:10:59.645576 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642564 2606 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 09:10:59.645576 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642568 2606 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 09:10:59.646137 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642573 2606 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 09:10:59.646137 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642577 2606 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 09:10:59.646137 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642584 2606 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 09:10:59.646137 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642588 2606 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 09:10:59.646137 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642592 2606 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 09:10:59.646137 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642596 2606 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 09:10:59.646137 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642600 2606 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 09:10:59.646137 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642604 2606 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 09:10:59.646137 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642608 2606 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 09:10:59.646137 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642612 2606 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 09:10:59.646137 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642616 2606 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 09:10:59.646137 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642620 2606 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 09:10:59.646137 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642625 2606 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 09:10:59.646137 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642629 2606 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 09:10:59.646137 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642634 2606 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 09:10:59.646137 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642640 2606 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 09:10:59.646137 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642646 2606 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 09:10:59.646137 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642652 2606 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 09:10:59.646137 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642656 2606 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 09:10:59.646137 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642660 2606 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 09:10:59.646906 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642664 2606 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 09:10:59.646906 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642669 2606 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 09:10:59.646906 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642673 2606 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 09:10:59.646906 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642679 2606 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 09:10:59.646906 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642683 2606 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 09:10:59.646906 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642688 2606 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 09:10:59.646906 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642692 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 09:10:59.646906 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642697 2606 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 09:10:59.646906 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642701 2606 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 09:10:59.646906 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642705 2606 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 09:10:59.646906 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642709 2606 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 09:10:59.646906 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642713 2606 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 09:10:59.646906 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642718 2606 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 09:10:59.646906 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642722 2606 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 09:10:59.646906 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642727 2606 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 09:10:59.646906 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642737 2606 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 09:10:59.646906 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642742 2606 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 09:10:59.646906 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642746 2606 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 09:10:59.646906 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642750 2606 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 09:10:59.646906 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642754 2606 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 09:10:59.647445 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642758 2606 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 09:10:59.647445 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642763 2606 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 09:10:59.647445 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642767 2606 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 09:10:59.647445 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642772 2606 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 09:10:59.647445 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642776 2606 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 09:10:59.647445 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642780 2606 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 09:10:59.647445 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642784 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 09:10:59.647445 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642789 2606 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 09:10:59.647445 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642793 2606 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 09:10:59.647445 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642797 2606 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 09:10:59.647445 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642801 2606 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 09:10:59.647445 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642806 2606 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 09:10:59.647445 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642810 2606 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 09:10:59.647445 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642815 2606 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 09:10:59.647445 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642819 2606 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 09:10:59.647445 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642823 2606 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 09:10:59.647445 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642828 2606 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 09:10:59.647901 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.642836 2606 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 09:10:59.647901 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.642999 2606 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 09:10:59.647901 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643010 2606 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 09:10:59.647901 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643015 2606 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 09:10:59.647901 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643019 2606 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 09:10:59.647901 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643024 2606 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 09:10:59.647901 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643028 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 09:10:59.647901 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643033 2606 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 09:10:59.647901 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643037 2606 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 09:10:59.647901 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643042 2606 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 09:10:59.647901 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643047 2606 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 09:10:59.647901 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643052 2606 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 09:10:59.647901 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643057 2606 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 09:10:59.647901 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643061 2606 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 09:10:59.647901 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643066 2606 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 09:10:59.647901 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643070 2606 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 09:10:59.648295 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643075 2606 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 09:10:59.648295 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643081 2606 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 09:10:59.648295 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643087 2606 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 09:10:59.648295 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643092 2606 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 09:10:59.648295 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643098 2606 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 09:10:59.648295 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643102 2606 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 09:10:59.648295 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643107 2606 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 09:10:59.648295 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643111 2606 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 09:10:59.648295 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643116 2606 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 09:10:59.648295 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643121 2606 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 09:10:59.648295 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643126 2606 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 09:10:59.648295 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643130 2606 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 09:10:59.648295 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643134 2606 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 09:10:59.648295 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643139 2606 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 09:10:59.648295 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643144 2606 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 09:10:59.648295 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643148 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 09:10:59.648295 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643153 2606 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 09:10:59.648295 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643157 2606 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 09:10:59.648295 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643162 2606 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 09:10:59.648825 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643166 2606 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 09:10:59.648825 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643170 2606 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 09:10:59.648825 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643175 2606 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 09:10:59.648825 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643180 2606 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 09:10:59.648825 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643184 2606 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 09:10:59.648825 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643189 2606 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 09:10:59.648825 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643194 2606 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 09:10:59.648825 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643200 2606 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 09:10:59.648825 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643206 2606 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 09:10:59.648825 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643212 2606 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 09:10:59.648825 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643217 2606 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 09:10:59.648825 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643221 2606 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 09:10:59.648825 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643225 2606 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 09:10:59.648825 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643230 2606 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 09:10:59.648825 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643234 2606 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 09:10:59.648825 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643238 2606 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 09:10:59.648825 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643242 2606 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 09:10:59.648825 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643247 2606 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 09:10:59.648825 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643251 2606 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 09:10:59.648825 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643256 2606 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 09:10:59.649342 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643260 2606 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 09:10:59.649342 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643264 2606 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 09:10:59.649342 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643268 2606 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 09:10:59.649342 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643273 2606 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 09:10:59.649342 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643277 2606 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 09:10:59.649342 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643281 2606 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 09:10:59.649342 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643285 2606 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 09:10:59.649342 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643289 2606 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 09:10:59.649342 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643293 2606 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 09:10:59.649342 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643297 2606 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 09:10:59.649342 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643301 2606 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 09:10:59.649342 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643305 2606 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 09:10:59.649342 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643309 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 09:10:59.649342 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643314 2606 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 09:10:59.649342 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643318 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 09:10:59.649342 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643322 2606 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 09:10:59.649342 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643326 2606 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 09:10:59.649342 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643331 2606 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 09:10:59.649342 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643335 2606 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 09:10:59.649342 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643340 2606 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 09:10:59.649857 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643344 2606 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 09:10:59.649857 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643349 2606 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 09:10:59.649857 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643354 2606 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 09:10:59.649857 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643358 2606 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 09:10:59.649857 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643363 2606 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 09:10:59.649857 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643367 2606 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 09:10:59.649857 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643371 2606 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 09:10:59.649857 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643375 2606 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 09:10:59.649857 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643379 2606 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 09:10:59.649857 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643384 2606 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 09:10:59.649857 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643388 2606 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 09:10:59.649857 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:10:59.643392 2606 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 09:10:59.649857 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.643400 2606 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 09:10:59.649857 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.643609 2606 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 09:10:59.650221 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.646291 2606 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 09:10:59.650221 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.647325 2606 server.go:1019] "Starting client certificate rotation"
Apr 17 09:10:59.650221 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.647423 2606 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 09:10:59.650221 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.647466 2606 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 09:10:59.671353 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.671320 2606 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 09:10:59.673875 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.673845 2606 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 09:10:59.690935 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.690908 2606 log.go:25] "Validated CRI v1 runtime API"
Apr 17 09:10:59.697777 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.697747 2606 log.go:25] "Validated CRI v1 image API"
Apr 17 09:10:59.698933 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.698915 2606 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 09:10:59.702604 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.702397 2606 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 b6a3b514-0af2-4096-9d5c-fd2dd2e63e62:/dev/nvme0n1p4 e48a3a19-c530-45e7-86aa-396860af9837:/dev/nvme0n1p3]
Apr 17 09:10:59.702604 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.702594 2606 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 09:10:59.702747 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.702602 2606 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 09:10:59.708988 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.708866 2606 manager.go:217] Machine: {Timestamp:2026-04-17 09:10:59.706827096 +0000 UTC m=+0.426325148 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3113308 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec230cbc9e4468d2365fce95a7176a6d SystemUUID:ec230cbc-9e44-68d2-365f-ce95a7176a6d BootID:020382c8-96df-4947-b466-7b36496cb27e Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:2b:cb:66:e0:09 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:2b:cb:66:e0:09 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:1e:ca:84:c2:63:bf Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 09:10:59.708988 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.708981 2606 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 09:10:59.709098 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.709073 2606 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 09:10:59.710866 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.710844 2606 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 09:10:59.711013 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.710868 2606 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-18.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 09:10:59.711061 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.711024 2606 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 09:10:59.711061 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.711033 2606 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 09:10:59.711061 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.711047 2606 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 09:10:59.711824 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.711813 2606 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 09:10:59.713524 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.713513 2606 state_mem.go:36] "Initialized new in-memory state store" Apr 17 09:10:59.713647 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.713638 2606 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 09:10:59.716379 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.716367 2606 kubelet.go:491] "Attempting to sync node with API server" Apr 17 09:10:59.716430 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.716384 2606 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 09:10:59.716430 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.716397 2606 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 09:10:59.716430 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.716407 2606 kubelet.go:397] "Adding apiserver pod source" Apr 17 09:10:59.716430 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.716416 2606 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 17 09:10:59.717609 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.717593 2606 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 09:10:59.717702 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.717614 2606 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 09:10:59.720602 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.720587 2606 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 09:10:59.721829 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.721817 2606 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 09:10:59.723977 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.723961 2606 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 09:10:59.723977 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.723979 2606 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 09:10:59.724065 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.723986 2606 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 09:10:59.724065 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.723992 2606 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 09:10:59.724065 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.723997 2606 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 09:10:59.724065 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.724003 2606 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 09:10:59.724065 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.724009 2606 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 
09:10:59.724065 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.724016 2606 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 09:10:59.724065 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.724023 2606 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 09:10:59.724065 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.724029 2606 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 09:10:59.724065 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.724038 2606 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 09:10:59.724065 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.724047 2606 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 09:10:59.724957 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.724947 2606 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 09:10:59.725000 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.724958 2606 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 09:10:59.728682 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.728666 2606 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 09:10:59.728765 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.728716 2606 server.go:1295] "Started kubelet" Apr 17 09:10:59.728823 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.728777 2606 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 09:10:59.728894 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.728840 2606 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 09:10:59.728944 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.728913 2606 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 09:10:59.729657 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.729634 2606 csi_plugin.go:988] Failed to contact API server 
when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-18.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 09:10:59.729791 ip-10-0-143-18 systemd[1]: Started Kubernetes Kubelet. Apr 17 09:10:59.729904 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:10:59.729831 2606 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-18.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 09:10:59.729959 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:10:59.729940 2606 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 09:10:59.729991 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.729960 2606 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 09:10:59.730900 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.730779 2606 server.go:317] "Adding debug handlers to kubelet server" Apr 17 09:10:59.736305 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.736283 2606 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 09:10:59.737046 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.737027 2606 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 09:10:59.737894 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.737871 2606 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 09:10:59.737894 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.737890 2606 
volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 09:10:59.738043 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.737901 2606 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 09:10:59.738158 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.738142 2606 reconstruct.go:97] "Volume reconstruction finished" Apr 17 09:10:59.738208 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.738159 2606 reconciler.go:26] "Reconciler: start to sync state" Apr 17 09:10:59.738929 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.738671 2606 factory.go:55] Registering systemd factory Apr 17 09:10:59.738929 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.738700 2606 factory.go:223] Registration of the systemd container factory successfully Apr 17 09:10:59.739224 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.738974 2606 factory.go:153] Registering CRI-O factory Apr 17 09:10:59.739224 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.738990 2606 factory.go:223] Registration of the crio container factory successfully Apr 17 09:10:59.739224 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.739040 2606 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 09:10:59.739224 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.739068 2606 factory.go:103] Registering Raw factory Apr 17 09:10:59.739224 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.739093 2606 manager.go:1196] Started watching for new ooms in manager Apr 17 09:10:59.739647 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.739558 2606 manager.go:319] Starting recovery of all containers Apr 17 09:10:59.739915 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:10:59.735960 2606 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-18.ec2.internal.18a719e76a126308 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-18.ec2.internal,UID:ip-10-0-143-18.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-143-18.ec2.internal,},FirstTimestamp:2026-04-17 09:10:59.728679688 +0000 UTC m=+0.448177738,LastTimestamp:2026-04-17 09:10:59.728679688 +0000 UTC m=+0.448177738,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-18.ec2.internal,}" Apr 17 09:10:59.740157 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:10:59.740128 2606 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-18.ec2.internal\" not found" Apr 17 09:10:59.741701 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:10:59.741676 2606 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 09:10:59.741988 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.741969 2606 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vz8d8" Apr 17 09:10:59.745712 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:10:59.745517 2606 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-143-18.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 09:10:59.745712 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:10:59.745532 2606 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 09:10:59.750002 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.749978 2606 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vz8d8" Apr 17 09:10:59.750289 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.750274 2606 manager.go:324] Recovery completed Apr 17 09:10:59.754539 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.754525 2606 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 09:10:59.757263 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.757245 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-18.ec2.internal" event="NodeHasSufficientMemory" Apr 17 09:10:59.757326 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.757277 2606 kubelet_node_status.go:736] "Recording event 
message for node" node="ip-10-0-143-18.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 09:10:59.757326 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.757289 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-18.ec2.internal" event="NodeHasSufficientPID" Apr 17 09:10:59.757815 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.757802 2606 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 09:10:59.757815 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.757814 2606 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 09:10:59.757897 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.757847 2606 state_mem.go:36] "Initialized new in-memory state store" Apr 17 09:10:59.759647 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:10:59.759577 2606 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-18.ec2.internal.18a719e76bc683df default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-18.ec2.internal,UID:ip-10-0-143-18.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-143-18.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-143-18.ec2.internal,},FirstTimestamp:2026-04-17 09:10:59.757261791 +0000 UTC m=+0.476759838,LastTimestamp:2026-04-17 09:10:59.757261791 +0000 UTC m=+0.476759838,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-18.ec2.internal,}" Apr 17 09:10:59.759821 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.759808 2606 policy_none.go:49] "None policy: Start" Apr 17 09:10:59.759867 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.759836 2606 
memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 09:10:59.759867 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.759856 2606 state_mem.go:35] "Initializing new in-memory state store" Apr 17 09:10:59.794123 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.794103 2606 manager.go:341] "Starting Device Plugin manager" Apr 17 09:10:59.805121 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:10:59.794151 2606 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 09:10:59.805121 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.794166 2606 server.go:85] "Starting device plugin registration server" Apr 17 09:10:59.805121 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.794485 2606 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 09:10:59.805121 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.794516 2606 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 09:10:59.805121 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.794656 2606 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 09:10:59.805121 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.794736 2606 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 09:10:59.805121 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.794744 2606 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 09:10:59.805121 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:10:59.795217 2606 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 17 09:10:59.805121 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:10:59.795253 2606 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-18.ec2.internal\" not found" Apr 17 09:10:59.864128 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.864093 2606 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 09:10:59.865287 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.865261 2606 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 09:10:59.865287 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.865286 2606 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 09:10:59.865481 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.865308 2606 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 09:10:59.865481 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.865320 2606 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 09:10:59.865481 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:10:59.865354 2606 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 09:10:59.867451 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.867427 2606 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 09:10:59.895204 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.895113 2606 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 09:10:59.896288 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.896271 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-18.ec2.internal" event="NodeHasSufficientMemory" Apr 17 09:10:59.896385 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.896301 2606 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-18.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 09:10:59.896385 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.896316 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-18.ec2.internal" event="NodeHasSufficientPID" Apr 17 09:10:59.896385 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.896344 2606 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-18.ec2.internal" Apr 17 09:10:59.904522 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.904504 2606 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-18.ec2.internal" Apr 17 09:10:59.904571 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:10:59.904531 2606 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-18.ec2.internal\": node \"ip-10-0-143-18.ec2.internal\" not found" Apr 17 09:10:59.920363 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:10:59.920330 2606 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-18.ec2.internal\" not found" Apr 17 09:10:59.965980 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.965927 2606 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-18.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-18.ec2.internal"] Apr 17 09:10:59.966133 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.966032 2606 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 09:10:59.967242 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.967227 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-18.ec2.internal" event="NodeHasSufficientMemory" Apr 17 09:10:59.967305 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.967258 2606 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-143-18.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 09:10:59.967305 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.967269 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-18.ec2.internal" event="NodeHasSufficientPID" Apr 17 09:10:59.968583 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.968570 2606 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 09:10:59.968737 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.968722 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-18.ec2.internal" Apr 17 09:10:59.968779 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.968754 2606 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 09:10:59.970729 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.970706 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-18.ec2.internal" event="NodeHasSufficientMemory" Apr 17 09:10:59.970833 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.970735 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-18.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 09:10:59.970833 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.970708 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-18.ec2.internal" event="NodeHasSufficientMemory" Apr 17 09:10:59.970833 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.970781 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-18.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 09:10:59.970833 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.970750 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-18.ec2.internal" event="NodeHasSufficientPID" Apr 17 09:10:59.970833 
ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.970791 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-18.ec2.internal" event="NodeHasSufficientPID" Apr 17 09:10:59.972031 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.972015 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-18.ec2.internal" Apr 17 09:10:59.972085 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.972048 2606 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 09:10:59.972896 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.972878 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-18.ec2.internal" event="NodeHasSufficientMemory" Apr 17 09:10:59.972987 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.972905 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-18.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 09:10:59.972987 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:10:59.972919 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-18.ec2.internal" event="NodeHasSufficientPID" Apr 17 09:10:59.988124 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:10:59.988101 2606 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-18.ec2.internal\" not found" node="ip-10-0-143-18.ec2.internal" Apr 17 09:10:59.991562 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:10:59.991543 2606 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-18.ec2.internal\" not found" node="ip-10-0-143-18.ec2.internal" Apr 17 09:11:00.021380 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:00.021341 2606 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-18.ec2.internal\" not found" Apr 
17 09:11:00.039976 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:00.039951 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/955792ba1322848ad185260c969a8a99-config\") pod \"kube-apiserver-proxy-ip-10-0-143-18.ec2.internal\" (UID: \"955792ba1322848ad185260c969a8a99\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-18.ec2.internal" Apr 17 09:11:00.040087 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:00.039980 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/37152d2cd2910a21c6149c8f6253015d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-18.ec2.internal\" (UID: \"37152d2cd2910a21c6149c8f6253015d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-18.ec2.internal" Apr 17 09:11:00.040087 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:00.040000 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/37152d2cd2910a21c6149c8f6253015d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-18.ec2.internal\" (UID: \"37152d2cd2910a21c6149c8f6253015d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-18.ec2.internal" Apr 17 09:11:00.121881 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:00.121843 2606 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-18.ec2.internal\" not found" Apr 17 09:11:00.140323 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:00.140292 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/37152d2cd2910a21c6149c8f6253015d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-18.ec2.internal\" (UID: \"37152d2cd2910a21c6149c8f6253015d\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-18.ec2.internal"
Apr 17 09:11:00.140454 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:00.140343 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/37152d2cd2910a21c6149c8f6253015d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-18.ec2.internal\" (UID: \"37152d2cd2910a21c6149c8f6253015d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-18.ec2.internal"
Apr 17 09:11:00.140454 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:00.140365 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/955792ba1322848ad185260c969a8a99-config\") pod \"kube-apiserver-proxy-ip-10-0-143-18.ec2.internal\" (UID: \"955792ba1322848ad185260c969a8a99\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-18.ec2.internal"
Apr 17 09:11:00.140454 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:00.140300 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/37152d2cd2910a21c6149c8f6253015d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-18.ec2.internal\" (UID: \"37152d2cd2910a21c6149c8f6253015d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-18.ec2.internal"
Apr 17 09:11:00.140454 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:00.140400 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/955792ba1322848ad185260c969a8a99-config\") pod \"kube-apiserver-proxy-ip-10-0-143-18.ec2.internal\" (UID: \"955792ba1322848ad185260c969a8a99\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-18.ec2.internal"
Apr 17 09:11:00.140454 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:00.140433 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/37152d2cd2910a21c6149c8f6253015d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-18.ec2.internal\" (UID: \"37152d2cd2910a21c6149c8f6253015d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-18.ec2.internal"
Apr 17 09:11:00.222741 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:00.222666 2606 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-18.ec2.internal\" not found"
Apr 17 09:11:00.290267 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:00.290233 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-18.ec2.internal"
Apr 17 09:11:00.293881 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:00.293864 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-18.ec2.internal"
Apr 17 09:11:00.323667 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:00.323634 2606 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-18.ec2.internal\" not found"
Apr 17 09:11:00.424253 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:00.424197 2606 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-18.ec2.internal\" not found"
Apr 17 09:11:00.524686 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:00.524658 2606 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-18.ec2.internal\" not found"
Apr 17 09:11:00.625407 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:00.625354 2606 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-18.ec2.internal\" not found"
Apr 17 09:11:00.646889 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:00.646851 2606 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 09:11:00.647553 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:00.647002 2606 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 09:11:00.725509 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:00.725431 2606 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-18.ec2.internal\" not found"
Apr 17 09:11:00.737809 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:00.737775 2606 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 09:11:00.747432 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:00.747408 2606 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 09:11:00.752309 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:00.752275 2606 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 09:05:59 +0000 UTC" deadline="2027-10-20 14:03:55.079995517 +0000 UTC"
Apr 17 09:11:00.752309 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:00.752304 2606 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13228h52m54.327695324s"
Apr 17 09:11:00.770769 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:00.770697 2606 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-pvdl5"
Apr 17 09:11:00.777025 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:00.776998 2606 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-pvdl5"
Apr 17 09:11:00.826613 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:00.826584 2606 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-18.ec2.internal\" not found"
Apr 17 09:11:00.876795 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:11:00.876762 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37152d2cd2910a21c6149c8f6253015d.slice/crio-de0a60cb6f31715609bd830ebeb165a8f2f9f1c983f23d0a4ecb9dc6b1c2e440 WatchSource:0}: Error finding container de0a60cb6f31715609bd830ebeb165a8f2f9f1c983f23d0a4ecb9dc6b1c2e440: Status 404 returned error can't find the container with id de0a60cb6f31715609bd830ebeb165a8f2f9f1c983f23d0a4ecb9dc6b1c2e440
Apr 17 09:11:00.877413 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:11:00.877388 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod955792ba1322848ad185260c969a8a99.slice/crio-69ee7dd921011a766458d6ce0ff2ce479bcf8b12e3b75e6bc9397fa8ff48e0a5 WatchSource:0}: Error finding container 69ee7dd921011a766458d6ce0ff2ce479bcf8b12e3b75e6bc9397fa8ff48e0a5: Status 404 returned error can't find the container with id 69ee7dd921011a766458d6ce0ff2ce479bcf8b12e3b75e6bc9397fa8ff48e0a5
Apr 17 09:11:00.881669 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:00.881652 2606 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 09:11:00.887559 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:00.887537 2606 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 09:11:00.927485 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:00.927432 2606 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-18.ec2.internal\" not found"
Apr 17 09:11:01.028054 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:01.027967 2606 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-18.ec2.internal\" not found"
Apr 17 09:11:01.074381 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.074355 2606 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 09:11:01.138098 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.138068 2606 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-18.ec2.internal"
Apr 17 09:11:01.154351 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.154322 2606 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 09:11:01.155202 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.155185 2606 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-18.ec2.internal"
Apr 17 09:11:01.164622 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.164602 2606 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 09:11:01.301908 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.301825 2606 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 09:11:01.638587 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.638510 2606 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 09:11:01.718110 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.718075 2606 apiserver.go:52] "Watching apiserver"
Apr 17 09:11:01.726020 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.725985 2606 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 09:11:01.727355 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.727325 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-dcjr8","openshift-multus/multus-additional-cni-plugins-2jnqg","openshift-network-operator/iptables-alerter-mnb85","openshift-ovn-kubernetes/ovnkube-node-pczcn","kube-system/konnectivity-agent-8ntkx","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574","openshift-dns/node-resolver-wxmnk","openshift-image-registry/node-ca-g4rrd","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-18.ec2.internal","openshift-multus/multus-d5sb9","openshift-multus/network-metrics-daemon-2w27v","openshift-network-diagnostics/network-check-target-7l5df","kube-system/kube-apiserver-proxy-ip-10-0-143-18.ec2.internal"]
Apr 17 09:11:01.728874 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.728759 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574"
Apr 17 09:11:01.729917 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.729892 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2jnqg"
Apr 17 09:11:01.731234 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.731065 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7l5df"
Apr 17 09:11:01.731234 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:01.731140 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7l5df" podUID="252902a0-d0ed-496b-bcbb-6dc20ec3c9d4"
Apr 17 09:11:01.731508 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.731462 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 09:11:01.731691 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.731670 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 09:11:01.731773 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.731694 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 09:11:01.731773 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.731755 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-4298b\""
Apr 17 09:11:01.732367 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.732217 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2w27v"
Apr 17 09:11:01.732367 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:01.732285 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2w27v" podUID="ef0a40f6-04de-4672-9770-be487916c08b"
Apr 17 09:11:01.734286 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.733367 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 09:11:01.734286 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.733392 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 09:11:01.734286 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.733746 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-lhfqr\""
Apr 17 09:11:01.734286 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.733870 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 09:11:01.734286 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.734000 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 09:11:01.734286 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.734082 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 09:11:01.736420 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.736053 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-d5sb9"
Apr 17 09:11:01.738062 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.738042 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pczcn"
Apr 17 09:11:01.739071 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.738935 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-jh7bg\""
Apr 17 09:11:01.739071 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.739000 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 09:11:01.741421 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.740328 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 09:11:01.741421 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.740631 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 09:11:01.741421 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.741050 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 09:11:01.741421 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.741191 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vr8x8\""
Apr 17 09:11:01.741421 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.741240 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 09:11:01.741421 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.741286 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 09:11:01.741421 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.741371 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 09:11:01.741938 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.741837 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-wxmnk"
Apr 17 09:11:01.743185 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.743165 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-g4rrd"
Apr 17 09:11:01.743555 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.743304 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-8ntkx"
Apr 17 09:11:01.744761 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.744741 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 09:11:01.745011 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.744990 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dcjr8"
Apr 17 09:11:01.745217 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.745201 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wftd8\""
Apr 17 09:11:01.745839 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.745539 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 09:11:01.745839 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.745616 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 09:11:01.745839 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.745698 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wnmh2\""
Apr 17 09:11:01.745839 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.745547 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 09:11:01.745839 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.745752 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-jksqb\""
Apr 17 09:11:01.745839 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.745539 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 09:11:01.745839 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.745835 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 09:11:01.746203 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.746004 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 09:11:01.746464 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.746447 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-mnb85"
Apr 17 09:11:01.747056 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.747038 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 09:11:01.747657 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.747638 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 09:11:01.747874 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.747855 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-p9r2n\""
Apr 17 09:11:01.748947 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.748772 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 09:11:01.748947 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.748917 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 09:11:01.749121 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.749104 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-nkzcx\""
Apr 17 09:11:01.749333 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.749317 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 09:11:01.749921 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.749903 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fdf1115b-531b-4095-86b7-8ac9a436dd2f-cni-binary-copy\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9"
Apr 17 09:11:01.750096 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.750080 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-etc-openvswitch\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn"
Apr 17 09:11:01.750232 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.750217 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-run-openvswitch\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn"
Apr 17 09:11:01.750420 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.750386 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-node-log\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn"
Apr 17 09:11:01.750515 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.750427 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-multus-socket-dir-parent\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9"
Apr 17 09:11:01.750515 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.750453 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fdf1115b-531b-4095-86b7-8ac9a436dd2f-multus-daemon-config\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9"
Apr 17 09:11:01.750589 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.750521 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-etc-kubernetes\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9"
Apr 17 09:11:01.750589 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.750546 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-multus-cni-dir\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9"
Apr 17 09:11:01.750650 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.750603 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-host-run-netns\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9"
Apr 17 09:11:01.750650 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.750634 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-hostroot\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9"
Apr 17 09:11:01.750705 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.750660 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8008d0c8-6370-47a8-8cbf-9a7377a5a8b3-device-dir\") pod \"aws-ebs-csi-driver-node-zh574\" (UID: \"8008d0c8-6370-47a8-8cbf-9a7377a5a8b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574"
Apr 17 09:11:01.750705 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.750687 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/07361f70-d7d2-4866-8e18-3bd88e02229e-cni-binary-copy\") pod \"multus-additional-cni-plugins-2jnqg\" (UID: \"07361f70-d7d2-4866-8e18-3bd88e02229e\") " pod="openshift-multus/multus-additional-cni-plugins-2jnqg"
Apr 17 09:11:01.750773 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.750711 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/07361f70-d7d2-4866-8e18-3bd88e02229e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2jnqg\" (UID: \"07361f70-d7d2-4866-8e18-3bd88e02229e\") " pod="openshift-multus/multus-additional-cni-plugins-2jnqg"
Apr 17 09:11:01.750773 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.750739 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-host-var-lib-cni-bin\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9"
Apr 17 09:11:01.750855 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.750823 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sqf5\" (UniqueName: \"kubernetes.io/projected/0a1e21cd-b620-41c1-9c1d-4e6fba3925ef-kube-api-access-4sqf5\") pod \"node-resolver-wxmnk\" (UID: \"0a1e21cd-b620-41c1-9c1d-4e6fba3925ef\") " pod="openshift-dns/node-resolver-wxmnk"
Apr 17 09:11:01.750893 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.750877 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/07361f70-d7d2-4866-8e18-3bd88e02229e-os-release\") pod \"multus-additional-cni-plugins-2jnqg\" (UID: \"07361f70-d7d2-4866-8e18-3bd88e02229e\") " pod="openshift-multus/multus-additional-cni-plugins-2jnqg"
Apr 17 09:11:01.750949 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.750905 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/07361f70-d7d2-4866-8e18-3bd88e02229e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2jnqg\" (UID: \"07361f70-d7d2-4866-8e18-3bd88e02229e\") " pod="openshift-multus/multus-additional-cni-plugins-2jnqg"
Apr 17 09:11:01.751004 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.750967 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0a1e21cd-b620-41c1-9c1d-4e6fba3925ef-tmp-dir\") pod \"node-resolver-wxmnk\" (UID: \"0a1e21cd-b620-41c1-9c1d-4e6fba3925ef\") " pod="openshift-dns/node-resolver-wxmnk"
Apr 17 09:11:01.751004 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.750991 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/07361f70-d7d2-4866-8e18-3bd88e02229e-cnibin\") pod \"multus-additional-cni-plugins-2jnqg\" (UID: \"07361f70-d7d2-4866-8e18-3bd88e02229e\") " pod="openshift-multus/multus-additional-cni-plugins-2jnqg"
Apr 17 09:11:01.751099 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751013 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-log-socket\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn"
Apr 17 09:11:01.751099 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751037 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fd082fd1-df83-4c85-ba4b-b7b20f551f67-ovn-node-metrics-cert\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn"
Apr 17 09:11:01.751099 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751060 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/07361f70-d7d2-4866-8e18-3bd88e02229e-system-cni-dir\") pod \"multus-additional-cni-plugins-2jnqg\" (UID: \"07361f70-d7d2-4866-8e18-3bd88e02229e\") " pod="openshift-multus/multus-additional-cni-plugins-2jnqg"
Apr 17 09:11:01.751240 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751101 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-run-systemd\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn"
Apr 17 09:11:01.751240 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751124 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkzn5\" (UniqueName: \"kubernetes.io/projected/fdf1115b-531b-4095-86b7-8ac9a436dd2f-kube-api-access-qkzn5\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9"
Apr 17 09:11:01.751240 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751148 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8008d0c8-6370-47a8-8cbf-9a7377a5a8b3-etc-selinux\") pod \"aws-ebs-csi-driver-node-zh574\" (UID: \"8008d0c8-6370-47a8-8cbf-9a7377a5a8b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574"
Apr 17 09:11:01.751240 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751174 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8008d0c8-6370-47a8-8cbf-9a7377a5a8b3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zh574\" (UID: \"8008d0c8-6370-47a8-8cbf-9a7377a5a8b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574"
Apr 17 09:11:01.751240 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751214 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-host-run-multus-certs\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9"
Apr 17 09:11:01.751478 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751259 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qbhr\" (UniqueName: \"kubernetes.io/projected/8008d0c8-6370-47a8-8cbf-9a7377a5a8b3-kube-api-access-7qbhr\") pod \"aws-ebs-csi-driver-node-zh574\" (UID: \"8008d0c8-6370-47a8-8cbf-9a7377a5a8b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574"
Apr 17 09:11:01.751478 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751294 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6lkb\" (UniqueName: \"kubernetes.io/projected/07361f70-d7d2-4866-8e18-3bd88e02229e-kube-api-access-r6lkb\") pod \"multus-additional-cni-plugins-2jnqg\" (UID: \"07361f70-d7d2-4866-8e18-3bd88e02229e\") " pod="openshift-multus/multus-additional-cni-plugins-2jnqg"
Apr 17 09:11:01.751478 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751321 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-host-slash\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn"
Apr 17 09:11:01.751478 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751359 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-host-run-netns\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn"
Apr 17 09:11:01.751478 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751394 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fd082fd1-df83-4c85-ba4b-b7b20f551f67-ovnkube-config\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn"
Apr 17 09:11:01.751478 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751425 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fd082fd1-df83-4c85-ba4b-b7b20f551f67-env-overrides\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn"
Apr 17 09:11:01.751478 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751450 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/07361f70-d7d2-4866-8e18-3bd88e02229e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2jnqg\" (UID: \"07361f70-d7d2-4866-8e18-3bd88e02229e\") " pod="openshift-multus/multus-additional-cni-plugins-2jnqg"
Apr 17 09:11:01.751478 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751475 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fd082fd1-df83-4c85-ba4b-b7b20f551f67-ovnkube-script-lib\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn"
Apr 17 09:11:01.751863 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751557 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27hng\" (UniqueName: \"kubernetes.io/projected/fd082fd1-df83-4c85-ba4b-b7b20f551f67-kube-api-access-27hng\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn"
Apr 17 09:11:01.751863 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751594 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-cnibin\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9"
Apr 17 09:11:01.751863 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751640 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-host-run-k8s-cni-cncf-io\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9"
Apr 17 09:11:01.751863 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751664 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-host-var-lib-cni-multus\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9"
Apr 17 09:11:01.751863 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751686 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-host-var-lib-kubelet\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9"
Apr 17 09:11:01.751863 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751711 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8008d0c8-6370-47a8-8cbf-9a7377a5a8b3-sys-fs\") pod \"aws-ebs-csi-driver-node-zh574\" (UID: \"8008d0c8-6370-47a8-8cbf-9a7377a5a8b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574"
Apr 17 09:11:01.751863 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751736 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz9s4\" (UniqueName: \"kubernetes.io/projected/252902a0-d0ed-496b-bcbb-6dc20ec3c9d4-kube-api-access-xz9s4\") pod \"network-check-target-7l5df\" (UID: \"252902a0-d0ed-496b-bcbb-6dc20ec3c9d4\") " pod="openshift-network-diagnostics/network-check-target-7l5df"
Apr 17 09:11:01.751863 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751759 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-run-ovn\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn"
Apr 17 09:11:01.751863 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751782 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-host-run-ovn-kubernetes\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn"
Apr 17 09:11:01.751863 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751808 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-system-cni-dir\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9"
Apr 17 09:11:01.751863 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751856 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-os-release\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9"
Apr 17 09:11:01.752329 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751878 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-systemd-units\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn"
Apr 17 09:11:01.752329 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751902 2606 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-host-cni-bin\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.752329 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751928 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0a1e21cd-b620-41c1-9c1d-4e6fba3925ef-hosts-file\") pod \"node-resolver-wxmnk\" (UID: \"0a1e21cd-b620-41c1-9c1d-4e6fba3925ef\") " pod="openshift-dns/node-resolver-wxmnk" Apr 17 09:11:01.752329 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751949 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-host-cni-netd\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.752329 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.751973 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.752329 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.752000 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8008d0c8-6370-47a8-8cbf-9a7377a5a8b3-registration-dir\") pod \"aws-ebs-csi-driver-node-zh574\" (UID: \"8008d0c8-6370-47a8-8cbf-9a7377a5a8b3\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574" Apr 17 09:11:01.752329 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.752024 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8008d0c8-6370-47a8-8cbf-9a7377a5a8b3-socket-dir\") pod \"aws-ebs-csi-driver-node-zh574\" (UID: \"8008d0c8-6370-47a8-8cbf-9a7377a5a8b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574" Apr 17 09:11:01.752329 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.752048 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-var-lib-openvswitch\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.752329 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.752072 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz7sd\" (UniqueName: \"kubernetes.io/projected/ef0a40f6-04de-4672-9770-be487916c08b-kube-api-access-lz7sd\") pod \"network-metrics-daemon-2w27v\" (UID: \"ef0a40f6-04de-4672-9770-be487916c08b\") " pod="openshift-multus/network-metrics-daemon-2w27v" Apr 17 09:11:01.752329 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.752104 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-multus-conf-dir\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.752329 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.752136 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-host-kubelet\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.752329 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.752173 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs\") pod \"network-metrics-daemon-2w27v\" (UID: \"ef0a40f6-04de-4672-9770-be487916c08b\") " pod="openshift-multus/network-metrics-daemon-2w27v" Apr 17 09:11:01.777679 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.777642 2606 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 09:06:00 +0000 UTC" deadline="2028-01-09 09:40:15.454553004 +0000 UTC" Apr 17 09:11:01.777679 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.777677 2606 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15168h29m13.676880501s" Apr 17 09:11:01.838907 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.838871 2606 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 09:11:01.853103 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.852966 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/07361f70-d7d2-4866-8e18-3bd88e02229e-cni-binary-copy\") pod \"multus-additional-cni-plugins-2jnqg\" (UID: \"07361f70-d7d2-4866-8e18-3bd88e02229e\") " pod="openshift-multus/multus-additional-cni-plugins-2jnqg" Apr 17 09:11:01.853103 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853019 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/07361f70-d7d2-4866-8e18-3bd88e02229e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2jnqg\" (UID: \"07361f70-d7d2-4866-8e18-3bd88e02229e\") " pod="openshift-multus/multus-additional-cni-plugins-2jnqg" Apr 17 09:11:01.853103 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853053 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-run\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.853103 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853077 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-sys\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.853103 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853103 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-host-var-lib-cni-bin\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.853457 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853128 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4sqf5\" (UniqueName: \"kubernetes.io/projected/0a1e21cd-b620-41c1-9c1d-4e6fba3925ef-kube-api-access-4sqf5\") pod \"node-resolver-wxmnk\" (UID: \"0a1e21cd-b620-41c1-9c1d-4e6fba3925ef\") " pod="openshift-dns/node-resolver-wxmnk" Apr 17 09:11:01.853457 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853152 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/07361f70-d7d2-4866-8e18-3bd88e02229e-os-release\") pod \"multus-additional-cni-plugins-2jnqg\" (UID: \"07361f70-d7d2-4866-8e18-3bd88e02229e\") " pod="openshift-multus/multus-additional-cni-plugins-2jnqg" Apr 17 09:11:01.853457 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853178 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-host-var-lib-cni-bin\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.853457 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853211 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/07361f70-d7d2-4866-8e18-3bd88e02229e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2jnqg\" (UID: \"07361f70-d7d2-4866-8e18-3bd88e02229e\") " pod="openshift-multus/multus-additional-cni-plugins-2jnqg" Apr 17 09:11:01.853457 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853243 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dfb32cbd-12dc-4c1b-add8-12bb56a19a40-host-slash\") pod \"iptables-alerter-mnb85\" (UID: \"dfb32cbd-12dc-4c1b-add8-12bb56a19a40\") " pod="openshift-network-operator/iptables-alerter-mnb85" Apr 17 09:11:01.853457 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853277 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0a1e21cd-b620-41c1-9c1d-4e6fba3925ef-tmp-dir\") pod \"node-resolver-wxmnk\" (UID: \"0a1e21cd-b620-41c1-9c1d-4e6fba3925ef\") " pod="openshift-dns/node-resolver-wxmnk" Apr 17 09:11:01.853457 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853301 2606 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/07361f70-d7d2-4866-8e18-3bd88e02229e-cnibin\") pod \"multus-additional-cni-plugins-2jnqg\" (UID: \"07361f70-d7d2-4866-8e18-3bd88e02229e\") " pod="openshift-multus/multus-additional-cni-plugins-2jnqg" Apr 17 09:11:01.853457 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853318 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/07361f70-d7d2-4866-8e18-3bd88e02229e-os-release\") pod \"multus-additional-cni-plugins-2jnqg\" (UID: \"07361f70-d7d2-4866-8e18-3bd88e02229e\") " pod="openshift-multus/multus-additional-cni-plugins-2jnqg" Apr 17 09:11:01.853457 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853325 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-log-socket\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.853457 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853352 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fd082fd1-df83-4c85-ba4b-b7b20f551f67-ovn-node-metrics-cert\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.853457 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853379 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-etc-sysctl-conf\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.853457 ip-10-0-143-18 
kubenswrapper[2606]: I0417 09:11:01.853406 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/07361f70-d7d2-4866-8e18-3bd88e02229e-system-cni-dir\") pod \"multus-additional-cni-plugins-2jnqg\" (UID: \"07361f70-d7d2-4866-8e18-3bd88e02229e\") " pod="openshift-multus/multus-additional-cni-plugins-2jnqg" Apr 17 09:11:01.853457 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853429 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-run-systemd\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.853457 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853455 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a16122cc-e651-4705-adba-ac8adbc56a48-etc-tuned\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.854121 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853508 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srnh7\" (UniqueName: \"kubernetes.io/projected/a16122cc-e651-4705-adba-ac8adbc56a48-kube-api-access-srnh7\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.854121 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853544 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkzn5\" (UniqueName: \"kubernetes.io/projected/fdf1115b-531b-4095-86b7-8ac9a436dd2f-kube-api-access-qkzn5\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " 
pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.854121 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853575 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8008d0c8-6370-47a8-8cbf-9a7377a5a8b3-etc-selinux\") pod \"aws-ebs-csi-driver-node-zh574\" (UID: \"8008d0c8-6370-47a8-8cbf-9a7377a5a8b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574" Apr 17 09:11:01.854121 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853601 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8008d0c8-6370-47a8-8cbf-9a7377a5a8b3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zh574\" (UID: \"8008d0c8-6370-47a8-8cbf-9a7377a5a8b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574" Apr 17 09:11:01.854121 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853632 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0a1e21cd-b620-41c1-9c1d-4e6fba3925ef-tmp-dir\") pod \"node-resolver-wxmnk\" (UID: \"0a1e21cd-b620-41c1-9c1d-4e6fba3925ef\") " pod="openshift-dns/node-resolver-wxmnk" Apr 17 09:11:01.854121 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853624 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/07361f70-d7d2-4866-8e18-3bd88e02229e-system-cni-dir\") pod \"multus-additional-cni-plugins-2jnqg\" (UID: \"07361f70-d7d2-4866-8e18-3bd88e02229e\") " pod="openshift-multus/multus-additional-cni-plugins-2jnqg" Apr 17 09:11:01.854121 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853630 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-lib-modules\") pod \"tuned-dcjr8\" (UID: 
\"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.854121 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853668 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/07361f70-d7d2-4866-8e18-3bd88e02229e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2jnqg\" (UID: \"07361f70-d7d2-4866-8e18-3bd88e02229e\") " pod="openshift-multus/multus-additional-cni-plugins-2jnqg" Apr 17 09:11:01.854121 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853682 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-host-run-multus-certs\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.854121 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853693 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/07361f70-d7d2-4866-8e18-3bd88e02229e-cnibin\") pod \"multus-additional-cni-plugins-2jnqg\" (UID: \"07361f70-d7d2-4866-8e18-3bd88e02229e\") " pod="openshift-multus/multus-additional-cni-plugins-2jnqg" Apr 17 09:11:01.854121 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853696 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/07361f70-d7d2-4866-8e18-3bd88e02229e-cni-binary-copy\") pod \"multus-additional-cni-plugins-2jnqg\" (UID: \"07361f70-d7d2-4866-8e18-3bd88e02229e\") " pod="openshift-multus/multus-additional-cni-plugins-2jnqg" Apr 17 09:11:01.854121 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853708 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qbhr\" (UniqueName: 
\"kubernetes.io/projected/8008d0c8-6370-47a8-8cbf-9a7377a5a8b3-kube-api-access-7qbhr\") pod \"aws-ebs-csi-driver-node-zh574\" (UID: \"8008d0c8-6370-47a8-8cbf-9a7377a5a8b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574" Apr 17 09:11:01.854121 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853738 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r6lkb\" (UniqueName: \"kubernetes.io/projected/07361f70-d7d2-4866-8e18-3bd88e02229e-kube-api-access-r6lkb\") pod \"multus-additional-cni-plugins-2jnqg\" (UID: \"07361f70-d7d2-4866-8e18-3bd88e02229e\") " pod="openshift-multus/multus-additional-cni-plugins-2jnqg" Apr 17 09:11:01.854121 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853763 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-host-slash\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.854121 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853764 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-run-systemd\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.854121 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853787 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-host-run-netns\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.854121 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853793 2606 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-log-socket\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.854899 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853811 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fd082fd1-df83-4c85-ba4b-b7b20f551f67-ovnkube-config\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.854899 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853817 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8008d0c8-6370-47a8-8cbf-9a7377a5a8b3-etc-selinux\") pod \"aws-ebs-csi-driver-node-zh574\" (UID: \"8008d0c8-6370-47a8-8cbf-9a7377a5a8b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574" Apr 17 09:11:01.854899 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853823 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8008d0c8-6370-47a8-8cbf-9a7377a5a8b3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zh574\" (UID: \"8008d0c8-6370-47a8-8cbf-9a7377a5a8b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574" Apr 17 09:11:01.854899 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853835 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fd082fd1-df83-4c85-ba4b-b7b20f551f67-env-overrides\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.854899 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853847 2606 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-host-slash\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.854899 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853862 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/07361f70-d7d2-4866-8e18-3bd88e02229e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2jnqg\" (UID: \"07361f70-d7d2-4866-8e18-3bd88e02229e\") " pod="openshift-multus/multus-additional-cni-plugins-2jnqg" Apr 17 09:11:01.854899 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853871 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-host-run-netns\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.854899 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853908 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fd082fd1-df83-4c85-ba4b-b7b20f551f67-ovnkube-script-lib\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.854899 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853936 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27hng\" (UniqueName: \"kubernetes.io/projected/fd082fd1-df83-4c85-ba4b-b7b20f551f67-kube-api-access-27hng\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 
09:11:01.854899 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853936 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-host-run-multus-certs\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.854899 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853976 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-cnibin\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.854899 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.853983 2606 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 09:11:01.854899 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854001 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-host-run-k8s-cni-cncf-io\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.854899 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854023 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-host-var-lib-cni-multus\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.854899 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854070 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-host-var-lib-kubelet\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.854899 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854091 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8008d0c8-6370-47a8-8cbf-9a7377a5a8b3-sys-fs\") pod \"aws-ebs-csi-driver-node-zh574\" (UID: \"8008d0c8-6370-47a8-8cbf-9a7377a5a8b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574" Apr 17 09:11:01.854899 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854125 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-cnibin\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.854899 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854132 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xz9s4\" (UniqueName: \"kubernetes.io/projected/252902a0-d0ed-496b-bcbb-6dc20ec3c9d4-kube-api-access-xz9s4\") pod \"network-check-target-7l5df\" (UID: \"252902a0-d0ed-496b-bcbb-6dc20ec3c9d4\") " pod="openshift-network-diagnostics/network-check-target-7l5df" Apr 17 09:11:01.855702 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854176 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-run-ovn\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.855702 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854208 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-host-run-ovn-kubernetes\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.855702 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854237 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-system-cni-dir\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.855702 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854206 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-host-var-lib-cni-multus\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.855702 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854263 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-os-release\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.855702 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854287 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-systemd-units\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.855702 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854306 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" 
(UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-host-run-k8s-cni-cncf-io\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.855702 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854312 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-host-cni-bin\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.855702 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854316 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fd082fd1-df83-4c85-ba4b-b7b20f551f67-env-overrides\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.855702 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854342 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e4f475ef-a8f7-4778-86ea-42db013683b6-serviceca\") pod \"node-ca-g4rrd\" (UID: \"e4f475ef-a8f7-4778-86ea-42db013683b6\") " pod="openshift-image-registry/node-ca-g4rrd" Apr 17 09:11:01.855702 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854384 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-system-cni-dir\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.855702 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854389 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/4a59d22a-1ddf-48a4-b7d4-91e233d1c8b9-agent-certs\") pod \"konnectivity-agent-8ntkx\" (UID: \"4a59d22a-1ddf-48a4-b7d4-91e233d1c8b9\") " pod="kube-system/konnectivity-agent-8ntkx" Apr 17 09:11:01.855702 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854404 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/07361f70-d7d2-4866-8e18-3bd88e02229e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2jnqg\" (UID: \"07361f70-d7d2-4866-8e18-3bd88e02229e\") " pod="openshift-multus/multus-additional-cni-plugins-2jnqg" Apr 17 09:11:01.855702 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854408 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-host-var-lib-kubelet\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.855702 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854417 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-etc-sysconfig\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.855702 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854454 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-etc-kubernetes\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.855702 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854512 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-run-ovn\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.855702 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854474 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-os-release\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.856218 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854528 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8008d0c8-6370-47a8-8cbf-9a7377a5a8b3-sys-fs\") pod \"aws-ebs-csi-driver-node-zh574\" (UID: \"8008d0c8-6370-47a8-8cbf-9a7377a5a8b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574" Apr 17 09:11:01.856218 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854520 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fd082fd1-df83-4c85-ba4b-b7b20f551f67-ovnkube-config\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.856218 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854561 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-host-cni-bin\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.856218 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854562 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/0a1e21cd-b620-41c1-9c1d-4e6fba3925ef-hosts-file\") pod \"node-resolver-wxmnk\" (UID: \"0a1e21cd-b620-41c1-9c1d-4e6fba3925ef\") " pod="openshift-dns/node-resolver-wxmnk" Apr 17 09:11:01.856218 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854568 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-host-run-ovn-kubernetes\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.856218 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854575 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fd082fd1-df83-4c85-ba4b-b7b20f551f67-ovnkube-script-lib\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.856218 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854608 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-systemd-units\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.856218 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854609 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-host-cni-netd\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.856218 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854644 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-host-cni-netd\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.856218 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854649 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.856218 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854676 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0a1e21cd-b620-41c1-9c1d-4e6fba3925ef-hosts-file\") pod \"node-resolver-wxmnk\" (UID: \"0a1e21cd-b620-41c1-9c1d-4e6fba3925ef\") " pod="openshift-dns/node-resolver-wxmnk" Apr 17 09:11:01.856218 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854682 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-etc-systemd\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.856218 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854682 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.856218 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854707 2606 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-host\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.856218 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854731 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8008d0c8-6370-47a8-8cbf-9a7377a5a8b3-registration-dir\") pod \"aws-ebs-csi-driver-node-zh574\" (UID: \"8008d0c8-6370-47a8-8cbf-9a7377a5a8b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574" Apr 17 09:11:01.856218 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854749 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4f475ef-a8f7-4778-86ea-42db013683b6-host\") pod \"node-ca-g4rrd\" (UID: \"e4f475ef-a8f7-4778-86ea-42db013683b6\") " pod="openshift-image-registry/node-ca-g4rrd" Apr 17 09:11:01.856218 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854773 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a16122cc-e651-4705-adba-ac8adbc56a48-tmp\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.856804 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854796 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/dfb32cbd-12dc-4c1b-add8-12bb56a19a40-iptables-alerter-script\") pod \"iptables-alerter-mnb85\" (UID: \"dfb32cbd-12dc-4c1b-add8-12bb56a19a40\") " pod="openshift-network-operator/iptables-alerter-mnb85" Apr 17 
09:11:01.856804 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854818 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8008d0c8-6370-47a8-8cbf-9a7377a5a8b3-socket-dir\") pod \"aws-ebs-csi-driver-node-zh574\" (UID: \"8008d0c8-6370-47a8-8cbf-9a7377a5a8b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574" Apr 17 09:11:01.856804 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854835 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-var-lib-openvswitch\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.856804 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854814 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8008d0c8-6370-47a8-8cbf-9a7377a5a8b3-registration-dir\") pod \"aws-ebs-csi-driver-node-zh574\" (UID: \"8008d0c8-6370-47a8-8cbf-9a7377a5a8b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574" Apr 17 09:11:01.856804 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854859 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lz7sd\" (UniqueName: \"kubernetes.io/projected/ef0a40f6-04de-4672-9770-be487916c08b-kube-api-access-lz7sd\") pod \"network-metrics-daemon-2w27v\" (UID: \"ef0a40f6-04de-4672-9770-be487916c08b\") " pod="openshift-multus/network-metrics-daemon-2w27v" Apr 17 09:11:01.856804 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854875 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-multus-conf-dir\") pod \"multus-d5sb9\" (UID: 
\"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.856804 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854893 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-host-kubelet\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.856804 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854909 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs\") pod \"network-metrics-daemon-2w27v\" (UID: \"ef0a40f6-04de-4672-9770-be487916c08b\") " pod="openshift-multus/network-metrics-daemon-2w27v" Apr 17 09:11:01.856804 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854922 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-var-lib-openvswitch\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.856804 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854926 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4a59d22a-1ddf-48a4-b7d4-91e233d1c8b9-konnectivity-ca\") pod \"konnectivity-agent-8ntkx\" (UID: \"4a59d22a-1ddf-48a4-b7d4-91e233d1c8b9\") " pod="kube-system/konnectivity-agent-8ntkx" Apr 17 09:11:01.856804 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854885 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/07361f70-d7d2-4866-8e18-3bd88e02229e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2jnqg\" (UID: \"07361f70-d7d2-4866-8e18-3bd88e02229e\") " pod="openshift-multus/multus-additional-cni-plugins-2jnqg" Apr 17 09:11:01.856804 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854942 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-multus-conf-dir\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.856804 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854930 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8008d0c8-6370-47a8-8cbf-9a7377a5a8b3-socket-dir\") pod \"aws-ebs-csi-driver-node-zh574\" (UID: \"8008d0c8-6370-47a8-8cbf-9a7377a5a8b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574" Apr 17 09:11:01.856804 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854960 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-etc-sysctl-d\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.856804 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854983 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-host-kubelet\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.856804 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.854986 2606 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgwqg\" (UniqueName: \"kubernetes.io/projected/dfb32cbd-12dc-4c1b-add8-12bb56a19a40-kube-api-access-sgwqg\") pod \"iptables-alerter-mnb85\" (UID: \"dfb32cbd-12dc-4c1b-add8-12bb56a19a40\") " pod="openshift-network-operator/iptables-alerter-mnb85" Apr 17 09:11:01.856804 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:01.855016 2606 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:11:01.857378 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.855015 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fdf1115b-531b-4095-86b7-8ac9a436dd2f-cni-binary-copy\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.857378 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:01.855103 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs podName:ef0a40f6-04de-4672-9770-be487916c08b nodeName:}" failed. No retries permitted until 2026-04-17 09:11:02.355071398 +0000 UTC m=+3.074569453 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs") pod "network-metrics-daemon-2w27v" (UID: "ef0a40f6-04de-4672-9770-be487916c08b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:11:01.857378 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.855167 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-etc-openvswitch\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.857378 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.855197 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-run-openvswitch\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.857378 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.855204 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-etc-openvswitch\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.857378 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.855223 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-node-log\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.857378 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.855255 2606 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-etc-modprobe-d\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.857378 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.855269 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-run-openvswitch\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.857378 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.855280 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-var-lib-kubelet\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.857378 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.855298 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fd082fd1-df83-4c85-ba4b-b7b20f551f67-node-log\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.857378 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.855316 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-multus-socket-dir-parent\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.857378 ip-10-0-143-18 kubenswrapper[2606]: 
I0417 09:11:01.855348 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fdf1115b-531b-4095-86b7-8ac9a436dd2f-multus-daemon-config\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.857378 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.855374 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-etc-kubernetes\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.857378 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.855402 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs4vj\" (UniqueName: \"kubernetes.io/projected/e4f475ef-a8f7-4778-86ea-42db013683b6-kube-api-access-qs4vj\") pod \"node-ca-g4rrd\" (UID: \"e4f475ef-a8f7-4778-86ea-42db013683b6\") " pod="openshift-image-registry/node-ca-g4rrd" Apr 17 09:11:01.857378 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.855426 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-etc-kubernetes\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.857378 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.855431 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-multus-cni-dir\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.857378 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.855405 2606 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-multus-socket-dir-parent\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.858139 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.855473 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-host-run-netns\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.858139 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.855521 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-hostroot\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.858139 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.855548 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8008d0c8-6370-47a8-8cbf-9a7377a5a8b3-device-dir\") pod \"aws-ebs-csi-driver-node-zh574\" (UID: \"8008d0c8-6370-47a8-8cbf-9a7377a5a8b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574" Apr 17 09:11:01.858139 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.855591 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-multus-cni-dir\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.858139 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.855641 2606 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-host-run-netns\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.858139 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.855679 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fdf1115b-531b-4095-86b7-8ac9a436dd2f-hostroot\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.858139 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.855726 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8008d0c8-6370-47a8-8cbf-9a7377a5a8b3-device-dir\") pod \"aws-ebs-csi-driver-node-zh574\" (UID: \"8008d0c8-6370-47a8-8cbf-9a7377a5a8b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574" Apr 17 09:11:01.858139 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.856093 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fdf1115b-531b-4095-86b7-8ac9a436dd2f-cni-binary-copy\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.858139 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.856629 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fdf1115b-531b-4095-86b7-8ac9a436dd2f-multus-daemon-config\") pod \"multus-d5sb9\" (UID: \"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.858139 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.857832 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/fd082fd1-df83-4c85-ba4b-b7b20f551f67-ovn-node-metrics-cert\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.863383 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:01.863357 2606 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 09:11:01.863383 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:01.863384 2606 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 09:11:01.863643 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:01.863397 2606 projected.go:194] Error preparing data for projected volume kube-api-access-xz9s4 for pod openshift-network-diagnostics/network-check-target-7l5df: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:11:01.863643 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:01.863480 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/252902a0-d0ed-496b-bcbb-6dc20ec3c9d4-kube-api-access-xz9s4 podName:252902a0-d0ed-496b-bcbb-6dc20ec3c9d4 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:02.363462142 +0000 UTC m=+3.082960193 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xz9s4" (UniqueName: "kubernetes.io/projected/252902a0-d0ed-496b-bcbb-6dc20ec3c9d4-kube-api-access-xz9s4") pod "network-check-target-7l5df" (UID: "252902a0-d0ed-496b-bcbb-6dc20ec3c9d4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:11:01.866051 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.866026 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qbhr\" (UniqueName: \"kubernetes.io/projected/8008d0c8-6370-47a8-8cbf-9a7377a5a8b3-kube-api-access-7qbhr\") pod \"aws-ebs-csi-driver-node-zh574\" (UID: \"8008d0c8-6370-47a8-8cbf-9a7377a5a8b3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574" Apr 17 09:11:01.866214 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.866192 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27hng\" (UniqueName: \"kubernetes.io/projected/fd082fd1-df83-4c85-ba4b-b7b20f551f67-kube-api-access-27hng\") pod \"ovnkube-node-pczcn\" (UID: \"fd082fd1-df83-4c85-ba4b-b7b20f551f67\") " pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:01.866919 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.866476 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6lkb\" (UniqueName: \"kubernetes.io/projected/07361f70-d7d2-4866-8e18-3bd88e02229e-kube-api-access-r6lkb\") pod \"multus-additional-cni-plugins-2jnqg\" (UID: \"07361f70-d7d2-4866-8e18-3bd88e02229e\") " pod="openshift-multus/multus-additional-cni-plugins-2jnqg" Apr 17 09:11:01.866919 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.866860 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkzn5\" (UniqueName: \"kubernetes.io/projected/fdf1115b-531b-4095-86b7-8ac9a436dd2f-kube-api-access-qkzn5\") pod \"multus-d5sb9\" (UID: 
\"fdf1115b-531b-4095-86b7-8ac9a436dd2f\") " pod="openshift-multus/multus-d5sb9" Apr 17 09:11:01.867286 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.867251 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz7sd\" (UniqueName: \"kubernetes.io/projected/ef0a40f6-04de-4672-9770-be487916c08b-kube-api-access-lz7sd\") pod \"network-metrics-daemon-2w27v\" (UID: \"ef0a40f6-04de-4672-9770-be487916c08b\") " pod="openshift-multus/network-metrics-daemon-2w27v" Apr 17 09:11:01.867379 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.867359 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sqf5\" (UniqueName: \"kubernetes.io/projected/0a1e21cd-b620-41c1-9c1d-4e6fba3925ef-kube-api-access-4sqf5\") pod \"node-resolver-wxmnk\" (UID: \"0a1e21cd-b620-41c1-9c1d-4e6fba3925ef\") " pod="openshift-dns/node-resolver-wxmnk" Apr 17 09:11:01.870243 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.870189 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-18.ec2.internal" event={"ID":"37152d2cd2910a21c6149c8f6253015d","Type":"ContainerStarted","Data":"de0a60cb6f31715609bd830ebeb165a8f2f9f1c983f23d0a4ecb9dc6b1c2e440"} Apr 17 09:11:01.871276 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.871255 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-18.ec2.internal" event={"ID":"955792ba1322848ad185260c969a8a99","Type":"ContainerStarted","Data":"69ee7dd921011a766458d6ce0ff2ce479bcf8b12e3b75e6bc9397fa8ff48e0a5"} Apr 17 09:11:01.956158 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956070 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-etc-modprobe-d\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " 
pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.956158 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956121 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-var-lib-kubelet\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.956158 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956150 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qs4vj\" (UniqueName: \"kubernetes.io/projected/e4f475ef-a8f7-4778-86ea-42db013683b6-kube-api-access-qs4vj\") pod \"node-ca-g4rrd\" (UID: \"e4f475ef-a8f7-4778-86ea-42db013683b6\") " pod="openshift-image-registry/node-ca-g4rrd" Apr 17 09:11:01.957607 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956178 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-run\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.957607 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956201 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-sys\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.957607 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956229 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dfb32cbd-12dc-4c1b-add8-12bb56a19a40-host-slash\") pod \"iptables-alerter-mnb85\" (UID: \"dfb32cbd-12dc-4c1b-add8-12bb56a19a40\") " 
pod="openshift-network-operator/iptables-alerter-mnb85" Apr 17 09:11:01.957607 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956238 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-etc-modprobe-d\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.957607 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956259 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-var-lib-kubelet\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.957607 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956273 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-etc-sysctl-conf\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.957607 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956279 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-run\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.957607 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956301 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a16122cc-e651-4705-adba-ac8adbc56a48-etc-tuned\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " 
pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.957607 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956321 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dfb32cbd-12dc-4c1b-add8-12bb56a19a40-host-slash\") pod \"iptables-alerter-mnb85\" (UID: \"dfb32cbd-12dc-4c1b-add8-12bb56a19a40\") " pod="openshift-network-operator/iptables-alerter-mnb85" Apr 17 09:11:01.957607 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956326 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-srnh7\" (UniqueName: \"kubernetes.io/projected/a16122cc-e651-4705-adba-ac8adbc56a48-kube-api-access-srnh7\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.957607 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956328 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-sys\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.957607 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956372 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-lib-modules\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.957607 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956411 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-etc-sysctl-conf\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " 
pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.957607 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956428 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e4f475ef-a8f7-4778-86ea-42db013683b6-serviceca\") pod \"node-ca-g4rrd\" (UID: \"e4f475ef-a8f7-4778-86ea-42db013683b6\") " pod="openshift-image-registry/node-ca-g4rrd" Apr 17 09:11:01.957607 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956453 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4a59d22a-1ddf-48a4-b7d4-91e233d1c8b9-agent-certs\") pod \"konnectivity-agent-8ntkx\" (UID: \"4a59d22a-1ddf-48a4-b7d4-91e233d1c8b9\") " pod="kube-system/konnectivity-agent-8ntkx" Apr 17 09:11:01.957607 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956476 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-etc-sysconfig\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.957607 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956518 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-etc-kubernetes\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.957607 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956546 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-etc-systemd\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " 
pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.958428 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956564 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-lib-modules\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.958428 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956569 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-host\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.958428 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956615 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4f475ef-a8f7-4778-86ea-42db013683b6-host\") pod \"node-ca-g4rrd\" (UID: \"e4f475ef-a8f7-4778-86ea-42db013683b6\") " pod="openshift-image-registry/node-ca-g4rrd" Apr 17 09:11:01.958428 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956632 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-etc-sysconfig\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.958428 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956640 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a16122cc-e651-4705-adba-ac8adbc56a48-tmp\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.958428 
ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956680 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/dfb32cbd-12dc-4c1b-add8-12bb56a19a40-iptables-alerter-script\") pod \"iptables-alerter-mnb85\" (UID: \"dfb32cbd-12dc-4c1b-add8-12bb56a19a40\") " pod="openshift-network-operator/iptables-alerter-mnb85" Apr 17 09:11:01.958428 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956725 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4a59d22a-1ddf-48a4-b7d4-91e233d1c8b9-konnectivity-ca\") pod \"konnectivity-agent-8ntkx\" (UID: \"4a59d22a-1ddf-48a4-b7d4-91e233d1c8b9\") " pod="kube-system/konnectivity-agent-8ntkx" Apr 17 09:11:01.958428 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956749 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-etc-sysctl-d\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.958428 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956755 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4f475ef-a8f7-4778-86ea-42db013683b6-host\") pod \"node-ca-g4rrd\" (UID: \"e4f475ef-a8f7-4778-86ea-42db013683b6\") " pod="openshift-image-registry/node-ca-g4rrd" Apr 17 09:11:01.958428 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956774 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgwqg\" (UniqueName: \"kubernetes.io/projected/dfb32cbd-12dc-4c1b-add8-12bb56a19a40-kube-api-access-sgwqg\") pod \"iptables-alerter-mnb85\" (UID: \"dfb32cbd-12dc-4c1b-add8-12bb56a19a40\") " pod="openshift-network-operator/iptables-alerter-mnb85" 
Apr 17 09:11:01.958428 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.956966 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e4f475ef-a8f7-4778-86ea-42db013683b6-serviceca\") pod \"node-ca-g4rrd\" (UID: \"e4f475ef-a8f7-4778-86ea-42db013683b6\") " pod="openshift-image-registry/node-ca-g4rrd" Apr 17 09:11:01.958428 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.957111 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-etc-sysctl-d\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.958428 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.957166 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-etc-systemd\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.958428 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.957209 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-host\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.958428 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.957249 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a16122cc-e651-4705-adba-ac8adbc56a48-etc-kubernetes\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.958428 ip-10-0-143-18 kubenswrapper[2606]: I0417 
09:11:01.957267 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/dfb32cbd-12dc-4c1b-add8-12bb56a19a40-iptables-alerter-script\") pod \"iptables-alerter-mnb85\" (UID: \"dfb32cbd-12dc-4c1b-add8-12bb56a19a40\") " pod="openshift-network-operator/iptables-alerter-mnb85" Apr 17 09:11:01.958428 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.957452 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4a59d22a-1ddf-48a4-b7d4-91e233d1c8b9-konnectivity-ca\") pod \"konnectivity-agent-8ntkx\" (UID: \"4a59d22a-1ddf-48a4-b7d4-91e233d1c8b9\") " pod="kube-system/konnectivity-agent-8ntkx" Apr 17 09:11:01.959215 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.958977 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a16122cc-e651-4705-adba-ac8adbc56a48-etc-tuned\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.959215 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.959030 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a16122cc-e651-4705-adba-ac8adbc56a48-tmp\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.959324 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.959297 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4a59d22a-1ddf-48a4-b7d4-91e233d1c8b9-agent-certs\") pod \"konnectivity-agent-8ntkx\" (UID: \"4a59d22a-1ddf-48a4-b7d4-91e233d1c8b9\") " pod="kube-system/konnectivity-agent-8ntkx" Apr 17 09:11:01.969089 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.969059 2606 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgwqg\" (UniqueName: \"kubernetes.io/projected/dfb32cbd-12dc-4c1b-add8-12bb56a19a40-kube-api-access-sgwqg\") pod \"iptables-alerter-mnb85\" (UID: \"dfb32cbd-12dc-4c1b-add8-12bb56a19a40\") " pod="openshift-network-operator/iptables-alerter-mnb85" Apr 17 09:11:01.969362 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.969342 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-srnh7\" (UniqueName: \"kubernetes.io/projected/a16122cc-e651-4705-adba-ac8adbc56a48-kube-api-access-srnh7\") pod \"tuned-dcjr8\" (UID: \"a16122cc-e651-4705-adba-ac8adbc56a48\") " pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:01.969423 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:01.969340 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs4vj\" (UniqueName: \"kubernetes.io/projected/e4f475ef-a8f7-4778-86ea-42db013683b6-kube-api-access-qs4vj\") pod \"node-ca-g4rrd\" (UID: \"e4f475ef-a8f7-4778-86ea-42db013683b6\") " pod="openshift-image-registry/node-ca-g4rrd" Apr 17 09:11:02.044813 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:02.044779 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574" Apr 17 09:11:02.053547 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:02.053510 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2jnqg" Apr 17 09:11:02.062409 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:02.062383 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-d5sb9" Apr 17 09:11:02.069074 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:02.069048 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:11:02.082686 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:02.082657 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-wxmnk" Apr 17 09:11:02.091428 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:02.091396 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-g4rrd" Apr 17 09:11:02.098097 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:02.098077 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-8ntkx" Apr 17 09:11:02.105716 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:02.105693 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" Apr 17 09:11:02.113356 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:02.113334 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-mnb85" Apr 17 09:11:02.359704 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:02.359611 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs\") pod \"network-metrics-daemon-2w27v\" (UID: \"ef0a40f6-04de-4672-9770-be487916c08b\") " pod="openshift-multus/network-metrics-daemon-2w27v" Apr 17 09:11:02.359892 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:02.359736 2606 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:11:02.359892 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:02.359796 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs podName:ef0a40f6-04de-4672-9770-be487916c08b nodeName:}" failed. No retries permitted until 2026-04-17 09:11:03.359779862 +0000 UTC m=+4.079277917 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs") pod "network-metrics-daemon-2w27v" (UID: "ef0a40f6-04de-4672-9770-be487916c08b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:11:02.460463 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:02.460421 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xz9s4\" (UniqueName: \"kubernetes.io/projected/252902a0-d0ed-496b-bcbb-6dc20ec3c9d4-kube-api-access-xz9s4\") pod \"network-check-target-7l5df\" (UID: \"252902a0-d0ed-496b-bcbb-6dc20ec3c9d4\") " pod="openshift-network-diagnostics/network-check-target-7l5df" Apr 17 09:11:02.460665 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:02.460621 2606 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 09:11:02.460665 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:02.460646 2606 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 09:11:02.460665 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:02.460659 2606 projected.go:194] Error preparing data for projected volume kube-api-access-xz9s4 for pod openshift-network-diagnostics/network-check-target-7l5df: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:11:02.460829 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:02.460726 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/252902a0-d0ed-496b-bcbb-6dc20ec3c9d4-kube-api-access-xz9s4 podName:252902a0-d0ed-496b-bcbb-6dc20ec3c9d4 nodeName:}" failed. 
No retries permitted until 2026-04-17 09:11:03.460707446 +0000 UTC m=+4.180205504 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-xz9s4" (UniqueName: "kubernetes.io/projected/252902a0-d0ed-496b-bcbb-6dc20ec3c9d4-kube-api-access-xz9s4") pod "network-check-target-7l5df" (UID: "252902a0-d0ed-496b-bcbb-6dc20ec3c9d4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:11:02.612995 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:11:02.612950 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd082fd1_df83_4c85_ba4b_b7b20f551f67.slice/crio-416d8bd4b510de65c1d4acba8f0e43f490ea335f10c1f62cbf5e88c3040705dd WatchSource:0}: Error finding container 416d8bd4b510de65c1d4acba8f0e43f490ea335f10c1f62cbf5e88c3040705dd: Status 404 returned error can't find the container with id 416d8bd4b510de65c1d4acba8f0e43f490ea335f10c1f62cbf5e88c3040705dd Apr 17 09:11:02.614895 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:11:02.614868 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda16122cc_e651_4705_adba_ac8adbc56a48.slice/crio-8543fd6888e236b92d5179b7b8d6a78ba3dd0b9eda5cce19f3b5a81233df4d2a WatchSource:0}: Error finding container 8543fd6888e236b92d5179b7b8d6a78ba3dd0b9eda5cce19f3b5a81233df4d2a: Status 404 returned error can't find the container with id 8543fd6888e236b92d5179b7b8d6a78ba3dd0b9eda5cce19f3b5a81233df4d2a Apr 17 09:11:02.618350 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:11:02.618318 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdf1115b_531b_4095_86b7_8ac9a436dd2f.slice/crio-60a69c2e5bfc672f82f569837cf7a7d5edb681b0ded4e610126ed96af9c5bb7d WatchSource:0}: Error finding container 
60a69c2e5bfc672f82f569837cf7a7d5edb681b0ded4e610126ed96af9c5bb7d: Status 404 returned error can't find the container with id 60a69c2e5bfc672f82f569837cf7a7d5edb681b0ded4e610126ed96af9c5bb7d Apr 17 09:11:02.619260 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:11:02.619195 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a1e21cd_b620_41c1_9c1d_4e6fba3925ef.slice/crio-b139c3cefcfb3d3a25639c1554b944684a6dbd60e630cd721cfe4dd71e5830f2 WatchSource:0}: Error finding container b139c3cefcfb3d3a25639c1554b944684a6dbd60e630cd721cfe4dd71e5830f2: Status 404 returned error can't find the container with id b139c3cefcfb3d3a25639c1554b944684a6dbd60e630cd721cfe4dd71e5830f2 Apr 17 09:11:02.620292 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:11:02.620266 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07361f70_d7d2_4866_8e18_3bd88e02229e.slice/crio-62f645f3e3acd524fada62988486a484c490bed2a3bb36d83ab783ec81283988 WatchSource:0}: Error finding container 62f645f3e3acd524fada62988486a484c490bed2a3bb36d83ab783ec81283988: Status 404 returned error can't find the container with id 62f645f3e3acd524fada62988486a484c490bed2a3bb36d83ab783ec81283988 Apr 17 09:11:02.621219 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:11:02.621195 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfb32cbd_12dc_4c1b_add8_12bb56a19a40.slice/crio-3149f416c734d9ad7fb86faf558ff81f2515ef53d51cd7fe801d03d07e957bcc WatchSource:0}: Error finding container 3149f416c734d9ad7fb86faf558ff81f2515ef53d51cd7fe801d03d07e957bcc: Status 404 returned error can't find the container with id 3149f416c734d9ad7fb86faf558ff81f2515ef53d51cd7fe801d03d07e957bcc Apr 17 09:11:02.622836 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:11:02.622751 2606 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4f475ef_a8f7_4778_86ea_42db013683b6.slice/crio-faf779973b4e473a770b25707730ad753f581ac9bb819a5db0eaf9e5accca618 WatchSource:0}: Error finding container faf779973b4e473a770b25707730ad753f581ac9bb819a5db0eaf9e5accca618: Status 404 returned error can't find the container with id faf779973b4e473a770b25707730ad753f581ac9bb819a5db0eaf9e5accca618 Apr 17 09:11:02.625106 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:11:02.624141 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a59d22a_1ddf_48a4_b7d4_91e233d1c8b9.slice/crio-93a73b78c0349f709820552d1e835e6ad6626b3dc78f42facaabf9a489d062d0 WatchSource:0}: Error finding container 93a73b78c0349f709820552d1e835e6ad6626b3dc78f42facaabf9a489d062d0: Status 404 returned error can't find the container with id 93a73b78c0349f709820552d1e835e6ad6626b3dc78f42facaabf9a489d062d0 Apr 17 09:11:02.625522 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:11:02.625480 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8008d0c8_6370_47a8_8cbf_9a7377a5a8b3.slice/crio-a0239e27a77778f0b29e614be14d20ece015b9c4612721a3b1beb173466d2f4d WatchSource:0}: Error finding container a0239e27a77778f0b29e614be14d20ece015b9c4612721a3b1beb173466d2f4d: Status 404 returned error can't find the container with id a0239e27a77778f0b29e614be14d20ece015b9c4612721a3b1beb173466d2f4d Apr 17 09:11:02.778829 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:02.778787 2606 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 09:06:00 +0000 UTC" deadline="2027-12-11 10:01:31.906725106 +0000 UTC" Apr 17 09:11:02.778829 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:02.778824 2606 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="14472h50m29.127903801s" Apr 17 09:11:02.873694 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:02.873572 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-g4rrd" event={"ID":"e4f475ef-a8f7-4778-86ea-42db013683b6","Type":"ContainerStarted","Data":"faf779973b4e473a770b25707730ad753f581ac9bb819a5db0eaf9e5accca618"} Apr 17 09:11:02.874619 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:02.874589 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8ntkx" event={"ID":"4a59d22a-1ddf-48a4-b7d4-91e233d1c8b9","Type":"ContainerStarted","Data":"93a73b78c0349f709820552d1e835e6ad6626b3dc78f42facaabf9a489d062d0"} Apr 17 09:11:02.875584 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:02.875551 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-mnb85" event={"ID":"dfb32cbd-12dc-4c1b-add8-12bb56a19a40","Type":"ContainerStarted","Data":"3149f416c734d9ad7fb86faf558ff81f2515ef53d51cd7fe801d03d07e957bcc"} Apr 17 09:11:02.876506 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:02.876472 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2jnqg" event={"ID":"07361f70-d7d2-4866-8e18-3bd88e02229e","Type":"ContainerStarted","Data":"62f645f3e3acd524fada62988486a484c490bed2a3bb36d83ab783ec81283988"} Apr 17 09:11:02.877365 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:02.877346 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d5sb9" event={"ID":"fdf1115b-531b-4095-86b7-8ac9a436dd2f","Type":"ContainerStarted","Data":"60a69c2e5bfc672f82f569837cf7a7d5edb681b0ded4e610126ed96af9c5bb7d"} Apr 17 09:11:02.878215 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:02.878197 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" 
event={"ID":"a16122cc-e651-4705-adba-ac8adbc56a48","Type":"ContainerStarted","Data":"8543fd6888e236b92d5179b7b8d6a78ba3dd0b9eda5cce19f3b5a81233df4d2a"} Apr 17 09:11:02.879550 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:02.879525 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-18.ec2.internal" event={"ID":"955792ba1322848ad185260c969a8a99","Type":"ContainerStarted","Data":"2ca98f408179392a9baf91a30ea58762e094bbc4a394dfd012dedab084237fe6"} Apr 17 09:11:02.880479 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:02.880461 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574" event={"ID":"8008d0c8-6370-47a8-8cbf-9a7377a5a8b3","Type":"ContainerStarted","Data":"a0239e27a77778f0b29e614be14d20ece015b9c4612721a3b1beb173466d2f4d"} Apr 17 09:11:02.881469 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:02.881429 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" event={"ID":"fd082fd1-df83-4c85-ba4b-b7b20f551f67","Type":"ContainerStarted","Data":"416d8bd4b510de65c1d4acba8f0e43f490ea335f10c1f62cbf5e88c3040705dd"} Apr 17 09:11:02.882320 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:02.882303 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wxmnk" event={"ID":"0a1e21cd-b620-41c1-9c1d-4e6fba3925ef","Type":"ContainerStarted","Data":"b139c3cefcfb3d3a25639c1554b944684a6dbd60e630cd721cfe4dd71e5830f2"} Apr 17 09:11:02.892907 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:02.892866 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-18.ec2.internal" podStartSLOduration=1.892855137 podStartE2EDuration="1.892855137s" podCreationTimestamp="2026-04-17 09:11:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 
09:11:02.892675945 +0000 UTC m=+3.612174003" watchObservedRunningTime="2026-04-17 09:11:02.892855137 +0000 UTC m=+3.612353264" Apr 17 09:11:03.366740 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:03.366702 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs\") pod \"network-metrics-daemon-2w27v\" (UID: \"ef0a40f6-04de-4672-9770-be487916c08b\") " pod="openshift-multus/network-metrics-daemon-2w27v" Apr 17 09:11:03.366907 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:03.366878 2606 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:11:03.366974 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:03.366947 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs podName:ef0a40f6-04de-4672-9770-be487916c08b nodeName:}" failed. No retries permitted until 2026-04-17 09:11:05.366928605 +0000 UTC m=+6.086426661 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs") pod "network-metrics-daemon-2w27v" (UID: "ef0a40f6-04de-4672-9770-be487916c08b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:11:03.467762 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:03.467675 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xz9s4\" (UniqueName: \"kubernetes.io/projected/252902a0-d0ed-496b-bcbb-6dc20ec3c9d4-kube-api-access-xz9s4\") pod \"network-check-target-7l5df\" (UID: \"252902a0-d0ed-496b-bcbb-6dc20ec3c9d4\") " pod="openshift-network-diagnostics/network-check-target-7l5df" Apr 17 09:11:03.467927 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:03.467827 2606 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 09:11:03.467927 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:03.467854 2606 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 09:11:03.467927 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:03.467866 2606 projected.go:194] Error preparing data for projected volume kube-api-access-xz9s4 for pod openshift-network-diagnostics/network-check-target-7l5df: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:11:03.467927 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:03.467926 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/252902a0-d0ed-496b-bcbb-6dc20ec3c9d4-kube-api-access-xz9s4 podName:252902a0-d0ed-496b-bcbb-6dc20ec3c9d4 nodeName:}" failed. 
No retries permitted until 2026-04-17 09:11:05.467908633 +0000 UTC m=+6.187406671 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-xz9s4" (UniqueName: "kubernetes.io/projected/252902a0-d0ed-496b-bcbb-6dc20ec3c9d4-kube-api-access-xz9s4") pod "network-check-target-7l5df" (UID: "252902a0-d0ed-496b-bcbb-6dc20ec3c9d4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:11:03.865889 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:03.865806 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7l5df" Apr 17 09:11:03.866337 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:03.865947 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7l5df" podUID="252902a0-d0ed-496b-bcbb-6dc20ec3c9d4" Apr 17 09:11:03.878576 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:03.878534 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2w27v" Apr 17 09:11:03.878805 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:03.878751 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2w27v" podUID="ef0a40f6-04de-4672-9770-be487916c08b" Apr 17 09:11:03.918526 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:03.917268 2606 generic.go:358] "Generic (PLEG): container finished" podID="37152d2cd2910a21c6149c8f6253015d" containerID="387ef20e354310bac8c0a3440af7ee98e2475ee2e36a59cbd80a92b8478fc83c" exitCode=0 Apr 17 09:11:03.918526 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:03.918274 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-18.ec2.internal" event={"ID":"37152d2cd2910a21c6149c8f6253015d","Type":"ContainerDied","Data":"387ef20e354310bac8c0a3440af7ee98e2475ee2e36a59cbd80a92b8478fc83c"} Apr 17 09:11:04.945025 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:04.944987 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-18.ec2.internal" event={"ID":"37152d2cd2910a21c6149c8f6253015d","Type":"ContainerStarted","Data":"7827e4be8d90fb8628eac8e83b9604b6dac8f5ac31919060bd8625c71cc4d4ed"} Apr 17 09:11:04.959405 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:04.959126 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-18.ec2.internal" podStartSLOduration=3.959106065 podStartE2EDuration="3.959106065s" podCreationTimestamp="2026-04-17 09:11:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 09:11:04.958155949 +0000 UTC m=+5.677654010" watchObservedRunningTime="2026-04-17 09:11:04.959106065 +0000 UTC m=+5.678604125" Apr 17 09:11:05.384168 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:05.384125 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs\") pod \"network-metrics-daemon-2w27v\" (UID: \"ef0a40f6-04de-4672-9770-be487916c08b\") " pod="openshift-multus/network-metrics-daemon-2w27v" Apr 17 09:11:05.384370 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:05.384325 2606 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:11:05.384432 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:05.384392 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs podName:ef0a40f6-04de-4672-9770-be487916c08b nodeName:}" failed. No retries permitted until 2026-04-17 09:11:09.384373695 +0000 UTC m=+10.103871734 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs") pod "network-metrics-daemon-2w27v" (UID: "ef0a40f6-04de-4672-9770-be487916c08b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:11:05.484878 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:05.484831 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xz9s4\" (UniqueName: \"kubernetes.io/projected/252902a0-d0ed-496b-bcbb-6dc20ec3c9d4-kube-api-access-xz9s4\") pod \"network-check-target-7l5df\" (UID: \"252902a0-d0ed-496b-bcbb-6dc20ec3c9d4\") " pod="openshift-network-diagnostics/network-check-target-7l5df" Apr 17 09:11:05.485049 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:05.485006 2606 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 09:11:05.485049 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:05.485027 2606 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 09:11:05.485049 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:05.485040 2606 projected.go:194] Error preparing data for projected volume kube-api-access-xz9s4 for pod openshift-network-diagnostics/network-check-target-7l5df: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:11:05.485207 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:05.485106 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/252902a0-d0ed-496b-bcbb-6dc20ec3c9d4-kube-api-access-xz9s4 podName:252902a0-d0ed-496b-bcbb-6dc20ec3c9d4 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:09.48508477 +0000 UTC m=+10.204582805 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-xz9s4" (UniqueName: "kubernetes.io/projected/252902a0-d0ed-496b-bcbb-6dc20ec3c9d4-kube-api-access-xz9s4") pod "network-check-target-7l5df" (UID: "252902a0-d0ed-496b-bcbb-6dc20ec3c9d4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:11:05.866701 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:05.866616 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2w27v" Apr 17 09:11:05.866864 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:05.866774 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2w27v" podUID="ef0a40f6-04de-4672-9770-be487916c08b" Apr 17 09:11:05.867396 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:05.867370 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7l5df" Apr 17 09:11:05.867525 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:05.867500 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7l5df" podUID="252902a0-d0ed-496b-bcbb-6dc20ec3c9d4" Apr 17 09:11:07.418035 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:07.418004 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-vm9k6"] Apr 17 09:11:07.420572 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:07.420546 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vm9k6" Apr 17 09:11:07.420724 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:07.420634 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-vm9k6" podUID="c0db5b5a-0bf0-459d-ba1b-46054d880831" Apr 17 09:11:07.502089 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:07.502049 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c0db5b5a-0bf0-459d-ba1b-46054d880831-dbus\") pod \"global-pull-secret-syncer-vm9k6\" (UID: \"c0db5b5a-0bf0-459d-ba1b-46054d880831\") " pod="kube-system/global-pull-secret-syncer-vm9k6" Apr 17 09:11:07.502252 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:07.502105 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c0db5b5a-0bf0-459d-ba1b-46054d880831-kubelet-config\") pod \"global-pull-secret-syncer-vm9k6\" (UID: \"c0db5b5a-0bf0-459d-ba1b-46054d880831\") " pod="kube-system/global-pull-secret-syncer-vm9k6" Apr 17 09:11:07.502252 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:07.502172 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c0db5b5a-0bf0-459d-ba1b-46054d880831-original-pull-secret\") pod \"global-pull-secret-syncer-vm9k6\" (UID: \"c0db5b5a-0bf0-459d-ba1b-46054d880831\") " pod="kube-system/global-pull-secret-syncer-vm9k6" Apr 17 09:11:07.602621 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:07.602580 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c0db5b5a-0bf0-459d-ba1b-46054d880831-dbus\") pod \"global-pull-secret-syncer-vm9k6\" (UID: \"c0db5b5a-0bf0-459d-ba1b-46054d880831\") " pod="kube-system/global-pull-secret-syncer-vm9k6" Apr 17 09:11:07.602801 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:07.602641 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/c0db5b5a-0bf0-459d-ba1b-46054d880831-kubelet-config\") pod \"global-pull-secret-syncer-vm9k6\" (UID: \"c0db5b5a-0bf0-459d-ba1b-46054d880831\") " pod="kube-system/global-pull-secret-syncer-vm9k6" Apr 17 09:11:07.602801 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:07.602716 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c0db5b5a-0bf0-459d-ba1b-46054d880831-original-pull-secret\") pod \"global-pull-secret-syncer-vm9k6\" (UID: \"c0db5b5a-0bf0-459d-ba1b-46054d880831\") " pod="kube-system/global-pull-secret-syncer-vm9k6" Apr 17 09:11:07.602801 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:07.602783 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c0db5b5a-0bf0-459d-ba1b-46054d880831-dbus\") pod \"global-pull-secret-syncer-vm9k6\" (UID: \"c0db5b5a-0bf0-459d-ba1b-46054d880831\") " pod="kube-system/global-pull-secret-syncer-vm9k6" Apr 17 09:11:07.602953 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:07.602864 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c0db5b5a-0bf0-459d-ba1b-46054d880831-kubelet-config\") pod \"global-pull-secret-syncer-vm9k6\" (UID: \"c0db5b5a-0bf0-459d-ba1b-46054d880831\") " pod="kube-system/global-pull-secret-syncer-vm9k6" Apr 17 09:11:07.602953 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:07.602874 2606 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 09:11:07.603057 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:07.602951 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0db5b5a-0bf0-459d-ba1b-46054d880831-original-pull-secret podName:c0db5b5a-0bf0-459d-ba1b-46054d880831 nodeName:}" failed. 
No retries permitted until 2026-04-17 09:11:08.102929771 +0000 UTC m=+8.822427820 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c0db5b5a-0bf0-459d-ba1b-46054d880831-original-pull-secret") pod "global-pull-secret-syncer-vm9k6" (UID: "c0db5b5a-0bf0-459d-ba1b-46054d880831") : object "kube-system"/"original-pull-secret" not registered Apr 17 09:11:07.866544 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:07.866418 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7l5df" Apr 17 09:11:07.866697 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:07.866603 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7l5df" podUID="252902a0-d0ed-496b-bcbb-6dc20ec3c9d4" Apr 17 09:11:07.866757 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:07.866726 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2w27v" Apr 17 09:11:07.866914 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:07.866866 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2w27v" podUID="ef0a40f6-04de-4672-9770-be487916c08b" Apr 17 09:11:08.110125 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:08.110074 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c0db5b5a-0bf0-459d-ba1b-46054d880831-original-pull-secret\") pod \"global-pull-secret-syncer-vm9k6\" (UID: \"c0db5b5a-0bf0-459d-ba1b-46054d880831\") " pod="kube-system/global-pull-secret-syncer-vm9k6" Apr 17 09:11:08.110302 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:08.110256 2606 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 09:11:08.110365 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:08.110328 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0db5b5a-0bf0-459d-ba1b-46054d880831-original-pull-secret podName:c0db5b5a-0bf0-459d-ba1b-46054d880831 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:09.110307655 +0000 UTC m=+9.829805705 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c0db5b5a-0bf0-459d-ba1b-46054d880831-original-pull-secret") pod "global-pull-secret-syncer-vm9k6" (UID: "c0db5b5a-0bf0-459d-ba1b-46054d880831") : object "kube-system"/"original-pull-secret" not registered Apr 17 09:11:08.866266 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:08.865900 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vm9k6" Apr 17 09:11:08.866266 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:08.866041 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vm9k6" podUID="c0db5b5a-0bf0-459d-ba1b-46054d880831" Apr 17 09:11:09.119144 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:09.119053 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c0db5b5a-0bf0-459d-ba1b-46054d880831-original-pull-secret\") pod \"global-pull-secret-syncer-vm9k6\" (UID: \"c0db5b5a-0bf0-459d-ba1b-46054d880831\") " pod="kube-system/global-pull-secret-syncer-vm9k6" Apr 17 09:11:09.119320 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:09.119246 2606 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 09:11:09.119320 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:09.119305 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0db5b5a-0bf0-459d-ba1b-46054d880831-original-pull-secret podName:c0db5b5a-0bf0-459d-ba1b-46054d880831 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:11.119287109 +0000 UTC m=+11.838785163 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c0db5b5a-0bf0-459d-ba1b-46054d880831-original-pull-secret") pod "global-pull-secret-syncer-vm9k6" (UID: "c0db5b5a-0bf0-459d-ba1b-46054d880831") : object "kube-system"/"original-pull-secret" not registered Apr 17 09:11:09.421347 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:09.421022 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs\") pod \"network-metrics-daemon-2w27v\" (UID: \"ef0a40f6-04de-4672-9770-be487916c08b\") " pod="openshift-multus/network-metrics-daemon-2w27v" Apr 17 09:11:09.421347 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:09.421187 2606 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:11:09.421347 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:09.421254 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs podName:ef0a40f6-04de-4672-9770-be487916c08b nodeName:}" failed. No retries permitted until 2026-04-17 09:11:17.421236609 +0000 UTC m=+18.140734644 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs") pod "network-metrics-daemon-2w27v" (UID: "ef0a40f6-04de-4672-9770-be487916c08b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:11:09.522035 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:09.521994 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xz9s4\" (UniqueName: \"kubernetes.io/projected/252902a0-d0ed-496b-bcbb-6dc20ec3c9d4-kube-api-access-xz9s4\") pod \"network-check-target-7l5df\" (UID: \"252902a0-d0ed-496b-bcbb-6dc20ec3c9d4\") " pod="openshift-network-diagnostics/network-check-target-7l5df" Apr 17 09:11:09.522228 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:09.522194 2606 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 09:11:09.522228 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:09.522227 2606 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 09:11:09.522366 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:09.522242 2606 projected.go:194] Error preparing data for projected volume kube-api-access-xz9s4 for pod openshift-network-diagnostics/network-check-target-7l5df: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:11:09.522366 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:09.522311 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/252902a0-d0ed-496b-bcbb-6dc20ec3c9d4-kube-api-access-xz9s4 podName:252902a0-d0ed-496b-bcbb-6dc20ec3c9d4 nodeName:}" failed. 
No retries permitted until 2026-04-17 09:11:17.522290787 +0000 UTC m=+18.241788830 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-xz9s4" (UniqueName: "kubernetes.io/projected/252902a0-d0ed-496b-bcbb-6dc20ec3c9d4-kube-api-access-xz9s4") pod "network-check-target-7l5df" (UID: "252902a0-d0ed-496b-bcbb-6dc20ec3c9d4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:11:09.867625 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:09.866906 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2w27v" Apr 17 09:11:09.867625 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:09.867038 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2w27v" podUID="ef0a40f6-04de-4672-9770-be487916c08b" Apr 17 09:11:09.867625 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:09.867412 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7l5df" Apr 17 09:11:09.867625 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:09.867527 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7l5df" podUID="252902a0-d0ed-496b-bcbb-6dc20ec3c9d4" Apr 17 09:11:10.865780 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:10.865746 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vm9k6" Apr 17 09:11:10.865962 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:10.865883 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vm9k6" podUID="c0db5b5a-0bf0-459d-ba1b-46054d880831" Apr 17 09:11:11.137294 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:11.137204 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c0db5b5a-0bf0-459d-ba1b-46054d880831-original-pull-secret\") pod \"global-pull-secret-syncer-vm9k6\" (UID: \"c0db5b5a-0bf0-459d-ba1b-46054d880831\") " pod="kube-system/global-pull-secret-syncer-vm9k6" Apr 17 09:11:11.137735 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:11.137362 2606 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 09:11:11.137735 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:11.137441 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0db5b5a-0bf0-459d-ba1b-46054d880831-original-pull-secret podName:c0db5b5a-0bf0-459d-ba1b-46054d880831 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:15.13741953 +0000 UTC m=+15.856917571 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c0db5b5a-0bf0-459d-ba1b-46054d880831-original-pull-secret") pod "global-pull-secret-syncer-vm9k6" (UID: "c0db5b5a-0bf0-459d-ba1b-46054d880831") : object "kube-system"/"original-pull-secret" not registered Apr 17 09:11:11.869115 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:11.869086 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2w27v" Apr 17 09:11:11.869286 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:11.869086 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7l5df" Apr 17 09:11:11.869286 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:11.869215 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2w27v" podUID="ef0a40f6-04de-4672-9770-be487916c08b" Apr 17 09:11:11.869286 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:11.869255 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7l5df" podUID="252902a0-d0ed-496b-bcbb-6dc20ec3c9d4" Apr 17 09:11:12.865764 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:12.865735 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-vm9k6"
Apr 17 09:11:12.866232 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:12.865855 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vm9k6" podUID="c0db5b5a-0bf0-459d-ba1b-46054d880831"
Apr 17 09:11:13.866423 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:13.866388 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7l5df"
Apr 17 09:11:13.866423 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:13.866426 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2w27v"
Apr 17 09:11:13.866956 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:13.866522 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7l5df" podUID="252902a0-d0ed-496b-bcbb-6dc20ec3c9d4"
Apr 17 09:11:13.866956 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:13.866658 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2w27v" podUID="ef0a40f6-04de-4672-9770-be487916c08b"
Apr 17 09:11:14.866046 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:14.866010 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vm9k6"
Apr 17 09:11:14.866252 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:14.866156 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vm9k6" podUID="c0db5b5a-0bf0-459d-ba1b-46054d880831"
Apr 17 09:11:15.165593 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:15.165553 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c0db5b5a-0bf0-459d-ba1b-46054d880831-original-pull-secret\") pod \"global-pull-secret-syncer-vm9k6\" (UID: \"c0db5b5a-0bf0-459d-ba1b-46054d880831\") " pod="kube-system/global-pull-secret-syncer-vm9k6"
Apr 17 09:11:15.166004 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:15.165751 2606 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 09:11:15.166004 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:15.165825 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0db5b5a-0bf0-459d-ba1b-46054d880831-original-pull-secret podName:c0db5b5a-0bf0-459d-ba1b-46054d880831 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:23.165809553 +0000 UTC m=+23.885307587 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c0db5b5a-0bf0-459d-ba1b-46054d880831-original-pull-secret") pod "global-pull-secret-syncer-vm9k6" (UID: "c0db5b5a-0bf0-459d-ba1b-46054d880831") : object "kube-system"/"original-pull-secret" not registered
Apr 17 09:11:15.869592 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:15.869555 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7l5df"
Apr 17 09:11:15.869592 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:15.869569 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2w27v"
Apr 17 09:11:15.869812 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:15.869681 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7l5df" podUID="252902a0-d0ed-496b-bcbb-6dc20ec3c9d4"
Apr 17 09:11:15.869868 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:15.869819 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2w27v" podUID="ef0a40f6-04de-4672-9770-be487916c08b"
Apr 17 09:11:16.866028 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:16.865985 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vm9k6"
Apr 17 09:11:16.866485 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:16.866147 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vm9k6" podUID="c0db5b5a-0bf0-459d-ba1b-46054d880831"
Apr 17 09:11:17.486513 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:17.486458 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs\") pod \"network-metrics-daemon-2w27v\" (UID: \"ef0a40f6-04de-4672-9770-be487916c08b\") " pod="openshift-multus/network-metrics-daemon-2w27v"
Apr 17 09:11:17.486693 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:17.486622 2606 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 09:11:17.486746 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:17.486703 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs podName:ef0a40f6-04de-4672-9770-be487916c08b nodeName:}" failed. No retries permitted until 2026-04-17 09:11:33.48667811 +0000 UTC m=+34.206176160 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs") pod "network-metrics-daemon-2w27v" (UID: "ef0a40f6-04de-4672-9770-be487916c08b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 09:11:17.586999 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:17.586958 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xz9s4\" (UniqueName: \"kubernetes.io/projected/252902a0-d0ed-496b-bcbb-6dc20ec3c9d4-kube-api-access-xz9s4\") pod \"network-check-target-7l5df\" (UID: \"252902a0-d0ed-496b-bcbb-6dc20ec3c9d4\") " pod="openshift-network-diagnostics/network-check-target-7l5df"
Apr 17 09:11:17.587153 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:17.587136 2606 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 09:11:17.587221 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:17.587159 2606 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 09:11:17.587221 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:17.587170 2606 projected.go:194] Error preparing data for projected volume kube-api-access-xz9s4 for pod openshift-network-diagnostics/network-check-target-7l5df: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 09:11:17.587315 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:17.587241 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/252902a0-d0ed-496b-bcbb-6dc20ec3c9d4-kube-api-access-xz9s4 podName:252902a0-d0ed-496b-bcbb-6dc20ec3c9d4 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:33.58722067 +0000 UTC m=+34.306718722 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-xz9s4" (UniqueName: "kubernetes.io/projected/252902a0-d0ed-496b-bcbb-6dc20ec3c9d4-kube-api-access-xz9s4") pod "network-check-target-7l5df" (UID: "252902a0-d0ed-496b-bcbb-6dc20ec3c9d4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 09:11:17.869132 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:17.869099 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2w27v"
Apr 17 09:11:17.869563 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:17.869099 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7l5df"
Apr 17 09:11:17.869563 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:17.869216 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2w27v" podUID="ef0a40f6-04de-4672-9770-be487916c08b"
Apr 17 09:11:17.869563 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:17.869298 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7l5df" podUID="252902a0-d0ed-496b-bcbb-6dc20ec3c9d4"
Apr 17 09:11:18.866306 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:18.866273 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vm9k6"
Apr 17 09:11:18.866463 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:18.866385 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vm9k6" podUID="c0db5b5a-0bf0-459d-ba1b-46054d880831"
Apr 17 09:11:19.867173 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:19.867080 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7l5df"
Apr 17 09:11:19.867692 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:19.867243 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7l5df" podUID="252902a0-d0ed-496b-bcbb-6dc20ec3c9d4"
Apr 17 09:11:19.867692 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:19.867272 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2w27v"
Apr 17 09:11:19.867692 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:19.867383 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2w27v" podUID="ef0a40f6-04de-4672-9770-be487916c08b"
Apr 17 09:11:20.866678 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:20.866304 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vm9k6"
Apr 17 09:11:20.866852 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:20.866791 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vm9k6" podUID="c0db5b5a-0bf0-459d-ba1b-46054d880831"
Apr 17 09:11:20.972431 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:20.972400 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574" event={"ID":"8008d0c8-6370-47a8-8cbf-9a7377a5a8b3","Type":"ContainerStarted","Data":"71077ec8d641b07e5b84d291e6d8d713db12de1f4b7948306dbbddd8610066d6"}
Apr 17 09:11:20.975155 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:20.975128 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pczcn_fd082fd1-df83-4c85-ba4b-b7b20f551f67/ovn-acl-logging/0.log"
Apr 17 09:11:20.975546 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:20.975484 2606 generic.go:358] "Generic (PLEG): container finished" podID="fd082fd1-df83-4c85-ba4b-b7b20f551f67" containerID="c2f7dbd4c017c0bc3bba9ad7f43b8ab27cd5c3c0b9465fd578c267816a14839e" exitCode=1
Apr 17 09:11:20.975679 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:20.975559 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" event={"ID":"fd082fd1-df83-4c85-ba4b-b7b20f551f67","Type":"ContainerStarted","Data":"900b53527b2e855d889b22f7f89ee7f78e5b5369df6c210bc8cf898aa18ea213"}
Apr 17 09:11:20.975679 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:20.975592 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" event={"ID":"fd082fd1-df83-4c85-ba4b-b7b20f551f67","Type":"ContainerStarted","Data":"4e48921a83e59e3a250449f57659dac757ddc3a901f345df8abc2e9744066846"}
Apr 17 09:11:20.975679 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:20.975604 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" event={"ID":"fd082fd1-df83-4c85-ba4b-b7b20f551f67","Type":"ContainerStarted","Data":"83863cbf6eaccad28760f049e969c4a6b09cc19d9a1d8413addc6e905e3918a2"}
Apr 17 09:11:20.975679 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:20.975616 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" event={"ID":"fd082fd1-df83-4c85-ba4b-b7b20f551f67","Type":"ContainerStarted","Data":"849f16ac5401f8e6b48126b3f96fb331ffe7c969d97c78019c99c641c745f042"}
Apr 17 09:11:20.975679 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:20.975630 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" event={"ID":"fd082fd1-df83-4c85-ba4b-b7b20f551f67","Type":"ContainerDied","Data":"c2f7dbd4c017c0bc3bba9ad7f43b8ab27cd5c3c0b9465fd578c267816a14839e"}
Apr 17 09:11:20.975679 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:20.975651 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" event={"ID":"fd082fd1-df83-4c85-ba4b-b7b20f551f67","Type":"ContainerStarted","Data":"037d878e12278cb05af177761d32e878007afe116f274c177ce19448bb6dd2cd"}
Apr 17 09:11:20.976983 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:20.976958 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wxmnk" event={"ID":"0a1e21cd-b620-41c1-9c1d-4e6fba3925ef","Type":"ContainerStarted","Data":"5d0c27681b2f60dce4e891eefa26efc55687b4bf7b16859a968bd4550a1a9bb7"}
Apr 17 09:11:20.978369 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:20.978337 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-g4rrd" event={"ID":"e4f475ef-a8f7-4778-86ea-42db013683b6","Type":"ContainerStarted","Data":"1e576cd8eba4459caccf8f0300513c56a51274317718dd9d61e0c052cb014ef7"}
Apr 17 09:11:20.979796 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:20.979768 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8ntkx" event={"ID":"4a59d22a-1ddf-48a4-b7d4-91e233d1c8b9","Type":"ContainerStarted","Data":"84ee1c550f853922eaa89a24f5cd42a2b50a4a93615ee30a5dd7148c40e73c81"}
Apr 17 09:11:20.981291 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:20.981258 2606 generic.go:358] "Generic (PLEG): container finished" podID="07361f70-d7d2-4866-8e18-3bd88e02229e" containerID="b415fa003c3be2e17545444509a129fd050296a218d8315246af45353b8deffc" exitCode=0
Apr 17 09:11:20.981421 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:20.981344 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2jnqg" event={"ID":"07361f70-d7d2-4866-8e18-3bd88e02229e","Type":"ContainerDied","Data":"b415fa003c3be2e17545444509a129fd050296a218d8315246af45353b8deffc"}
Apr 17 09:11:20.982830 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:20.982766 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d5sb9" event={"ID":"fdf1115b-531b-4095-86b7-8ac9a436dd2f","Type":"ContainerStarted","Data":"8030094301c3de5969a57acce9b84fc1981b074cc0cc4dd6bf0388bc2112de26"}
Apr 17 09:11:20.984207 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:20.984184 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" event={"ID":"a16122cc-e651-4705-adba-ac8adbc56a48","Type":"ContainerStarted","Data":"7d2fdb65f9fccad370e5f0060729fa97ce52cb69d4f52d6744003152cca8339c"}
Apr 17 09:11:20.993908 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:20.993865 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-wxmnk" podStartSLOduration=3.833661644 podStartE2EDuration="20.993851028s" podCreationTimestamp="2026-04-17 09:11:00 +0000 UTC" firstStartedPulling="2026-04-17 09:11:02.621280726 +0000 UTC m=+3.340778763" lastFinishedPulling="2026-04-17 09:11:19.781470113 +0000 UTC m=+20.500968147" observedRunningTime="2026-04-17 09:11:20.990459112 +0000 UTC m=+21.709957169" watchObservedRunningTime="2026-04-17 09:11:20.993851028 +0000 UTC m=+21.713349085"
Apr 17 09:11:21.002933 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:21.002888 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-8ntkx" podStartSLOduration=4.140730833 podStartE2EDuration="21.002874699s" podCreationTimestamp="2026-04-17 09:11:00 +0000 UTC" firstStartedPulling="2026-04-17 09:11:02.628840448 +0000 UTC m=+3.348338493" lastFinishedPulling="2026-04-17 09:11:19.490984304 +0000 UTC m=+20.210482359" observedRunningTime="2026-04-17 09:11:21.002643664 +0000 UTC m=+21.722141720" watchObservedRunningTime="2026-04-17 09:11:21.002874699 +0000 UTC m=+21.722372799"
Apr 17 09:11:21.016750 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:21.016690 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-d5sb9" podStartSLOduration=4.819935249 podStartE2EDuration="22.0166769s" podCreationTimestamp="2026-04-17 09:10:59 +0000 UTC" firstStartedPulling="2026-04-17 09:11:02.620301568 +0000 UTC m=+3.339799605" lastFinishedPulling="2026-04-17 09:11:19.817043221 +0000 UTC m=+20.536541256" observedRunningTime="2026-04-17 09:11:21.016656937 +0000 UTC m=+21.736154997" watchObservedRunningTime="2026-04-17 09:11:21.0166769 +0000 UTC m=+21.736174948"
Apr 17 09:11:21.029291 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:21.029242 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-dcjr8" podStartSLOduration=3.864849384 podStartE2EDuration="21.029228168s" podCreationTimestamp="2026-04-17 09:11:00 +0000 UTC" firstStartedPulling="2026-04-17 09:11:02.617092676 +0000 UTC m=+3.336590711" lastFinishedPulling="2026-04-17 09:11:19.781471445 +0000 UTC m=+20.500969495" observedRunningTime="2026-04-17 09:11:21.028872715 +0000 UTC m=+21.748370773" watchObservedRunningTime="2026-04-17 09:11:21.029228168 +0000 UTC m=+21.748726225"
Apr 17 09:11:21.602989 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:21.602960 2606 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 09:11:21.805210 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:21.805078 2606 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T09:11:21.602983515Z","UUID":"3b4d8658-986c-4082-988a-aa412278363c","Handler":null,"Name":"","Endpoint":""}
Apr 17 09:11:21.807549 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:21.807525 2606 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 09:11:21.807549 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:21.807554 2606 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 09:11:21.865570 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:21.865537 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7l5df"
Apr 17 09:11:21.865825 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:21.865659 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7l5df" podUID="252902a0-d0ed-496b-bcbb-6dc20ec3c9d4"
Apr 17 09:11:21.865825 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:21.865709 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2w27v"
Apr 17 09:11:21.865825 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:21.865800 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2w27v" podUID="ef0a40f6-04de-4672-9770-be487916c08b"
Apr 17 09:11:21.987087 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:21.987049 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-mnb85" event={"ID":"dfb32cbd-12dc-4c1b-add8-12bb56a19a40","Type":"ContainerStarted","Data":"3db2db4bccda225ed1022457e764986daf24c175314b98d7782a210f9f82a4a2"}
Apr 17 09:11:21.988682 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:21.988655 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574" event={"ID":"8008d0c8-6370-47a8-8cbf-9a7377a5a8b3","Type":"ContainerStarted","Data":"d72126db1525d1df320f7b1d353840a5105f5b413148c433f29b89fb518305dd"}
Apr 17 09:11:21.999835 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:21.999790 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-g4rrd" podStartSLOduration=9.510452532 podStartE2EDuration="21.999775307s" podCreationTimestamp="2026-04-17 09:11:00 +0000 UTC" firstStartedPulling="2026-04-17 09:11:02.625740671 +0000 UTC m=+3.345238714" lastFinishedPulling="2026-04-17 09:11:15.115063437 +0000 UTC m=+15.834561489" observedRunningTime="2026-04-17 09:11:21.057709034 +0000 UTC m=+21.777207091" watchObservedRunningTime="2026-04-17 09:11:21.999775307 +0000 UTC m=+22.719273413"
Apr 17 09:11:22.000152 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:22.000130 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-mnb85" podStartSLOduration=4.977229453 podStartE2EDuration="22.000125556s" podCreationTimestamp="2026-04-17 09:11:00 +0000 UTC" firstStartedPulling="2026-04-17 09:11:02.624757941 +0000 UTC m=+3.344255983" lastFinishedPulling="2026-04-17 09:11:19.647654033 +0000 UTC m=+20.367152086" observedRunningTime="2026-04-17 09:11:21.999561612 +0000 UTC m=+22.719059669" watchObservedRunningTime="2026-04-17 09:11:22.000125556 +0000 UTC m=+22.719623613"
Apr 17 09:11:22.866134 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:22.866097 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vm9k6"
Apr 17 09:11:22.866323 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:22.866220 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vm9k6" podUID="c0db5b5a-0bf0-459d-ba1b-46054d880831"
Apr 17 09:11:23.233318 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:23.233107 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c0db5b5a-0bf0-459d-ba1b-46054d880831-original-pull-secret\") pod \"global-pull-secret-syncer-vm9k6\" (UID: \"c0db5b5a-0bf0-459d-ba1b-46054d880831\") " pod="kube-system/global-pull-secret-syncer-vm9k6"
Apr 17 09:11:23.233758 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:23.233268 2606 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 09:11:23.233758 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:23.233429 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0db5b5a-0bf0-459d-ba1b-46054d880831-original-pull-secret podName:c0db5b5a-0bf0-459d-ba1b-46054d880831 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:39.233405595 +0000 UTC m=+39.952903652 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c0db5b5a-0bf0-459d-ba1b-46054d880831-original-pull-secret") pod "global-pull-secret-syncer-vm9k6" (UID: "c0db5b5a-0bf0-459d-ba1b-46054d880831") : object "kube-system"/"original-pull-secret" not registered
Apr 17 09:11:23.811760 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:23.811721 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-8ntkx"
Apr 17 09:11:23.866628 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:23.866586 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7l5df"
Apr 17 09:11:23.866826 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:23.866710 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7l5df" podUID="252902a0-d0ed-496b-bcbb-6dc20ec3c9d4"
Apr 17 09:11:23.866826 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:23.866774 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2w27v"
Apr 17 09:11:23.866942 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:23.866894 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2w27v" podUID="ef0a40f6-04de-4672-9770-be487916c08b"
Apr 17 09:11:23.995446 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:23.995412 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574" event={"ID":"8008d0c8-6370-47a8-8cbf-9a7377a5a8b3","Type":"ContainerStarted","Data":"9569b32dddb839f871509fbc65cf2a079ab6ff88a3dbabf3099b85c20a2423cd"}
Apr 17 09:11:23.998786 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:23.998721 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pczcn_fd082fd1-df83-4c85-ba4b-b7b20f551f67/ovn-acl-logging/0.log"
Apr 17 09:11:23.999143 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:23.999105 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" event={"ID":"fd082fd1-df83-4c85-ba4b-b7b20f551f67","Type":"ContainerStarted","Data":"164c5cfbbd35e62a4935c6c45d6ecbc2769a5800190ee691e8a8da8a27ae9aae"}
Apr 17 09:11:24.027404 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:24.027355 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zh574" podStartSLOduration=4.747789317 podStartE2EDuration="25.027336944s" podCreationTimestamp="2026-04-17 09:10:59 +0000 UTC" firstStartedPulling="2026-04-17 09:11:02.628128934 +0000 UTC m=+3.347626972" lastFinishedPulling="2026-04-17 09:11:22.90767655 +0000 UTC m=+23.627174599" observedRunningTime="2026-04-17 09:11:24.026848384 +0000 UTC m=+24.746346442" watchObservedRunningTime="2026-04-17 09:11:24.027336944 +0000 UTC m=+24.746835000"
Apr 17 09:11:24.866350 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:24.866316 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vm9k6"
Apr 17 09:11:24.866913 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:24.866428 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vm9k6" podUID="c0db5b5a-0bf0-459d-ba1b-46054d880831"
Apr 17 09:11:25.764895 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:25.764692 2606 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-8ntkx"
Apr 17 09:11:25.765377 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:25.765352 2606 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-8ntkx"
Apr 17 09:11:25.866348 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:25.866320 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2w27v"
Apr 17 09:11:25.866516 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:25.866320 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7l5df"
Apr 17 09:11:25.866516 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:25.866427 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2w27v" podUID="ef0a40f6-04de-4672-9770-be487916c08b"
Apr 17 09:11:25.867095 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:25.866549 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7l5df" podUID="252902a0-d0ed-496b-bcbb-6dc20ec3c9d4"
Apr 17 09:11:26.004786 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:26.004748 2606 generic.go:358] "Generic (PLEG): container finished" podID="07361f70-d7d2-4866-8e18-3bd88e02229e" containerID="9e2ca4a129e742f440bd28c6e4e5faaa80aed667fc455e17bd5f9f348ea41bfc" exitCode=0
Apr 17 09:11:26.004948 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:26.004837 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2jnqg" event={"ID":"07361f70-d7d2-4866-8e18-3bd88e02229e","Type":"ContainerDied","Data":"9e2ca4a129e742f440bd28c6e4e5faaa80aed667fc455e17bd5f9f348ea41bfc"}
Apr 17 09:11:26.008184 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:26.008166 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pczcn_fd082fd1-df83-4c85-ba4b-b7b20f551f67/ovn-acl-logging/0.log"
Apr 17 09:11:26.008548 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:26.008527 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" event={"ID":"fd082fd1-df83-4c85-ba4b-b7b20f551f67","Type":"ContainerStarted","Data":"d9af44511a48438fb3fc7832e9ba1170fb2635a68f99c576ed473f933d3b42b7"}
Apr 17 09:11:26.008802 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:26.008783 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pczcn"
Apr 17 09:11:26.008873 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:26.008814 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pczcn"
Apr 17 09:11:26.008946 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:26.008934 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pczcn"
Apr 17 09:11:26.009057 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:26.009041 2606 scope.go:117] "RemoveContainer" containerID="c2f7dbd4c017c0bc3bba9ad7f43b8ab27cd5c3c0b9465fd578c267816a14839e"
Apr 17 09:11:26.009391 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:26.009371 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-8ntkx"
Apr 17 09:11:26.024059 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:26.024028 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pczcn"
Apr 17 09:11:26.024944 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:26.024923 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pczcn"
Apr 17 09:11:26.865917 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:26.865881 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vm9k6"
Apr 17 09:11:26.865917 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:26.866007 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vm9k6" podUID="c0db5b5a-0bf0-459d-ba1b-46054d880831"
Apr 17 09:11:27.013219 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:27.013186 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2jnqg" event={"ID":"07361f70-d7d2-4866-8e18-3bd88e02229e","Type":"ContainerStarted","Data":"10c74a8ccc421bf65c6c44cea7e215aca8e7f6d69fde1a987c2a146267ae7328"}
Apr 17 09:11:27.016788 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:27.016769 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pczcn_fd082fd1-df83-4c85-ba4b-b7b20f551f67/ovn-acl-logging/0.log"
Apr 17 09:11:27.017169 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:27.017121 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" event={"ID":"fd082fd1-df83-4c85-ba4b-b7b20f551f67","Type":"ContainerStarted","Data":"49bc4eea089c48653120cfb7b297ff7b0687f66fc41307dd8abbb9b4c0b6338a"}
Apr 17 09:11:27.065869 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:27.065804 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" podStartSLOduration=9.805386546 podStartE2EDuration="27.065780711s" podCreationTimestamp="2026-04-17 09:11:00 +0000 UTC" firstStartedPulling="2026-04-17 09:11:02.615113121 +0000 UTC m=+3.334611173" lastFinishedPulling="2026-04-17 09:11:19.875507289 +0000 UTC m=+20.595005338" observedRunningTime="2026-04-17 09:11:27.065087653 +0000 UTC m=+27.784585711" watchObservedRunningTime="2026-04-17 09:11:27.065780711 +0000 UTC m=+27.785278768"
Apr 17 09:11:27.331613 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:27.331541 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vm9k6"]
Apr 17 09:11:27.331748 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:27.331665 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vm9k6"
Apr 17 09:11:27.331783 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:27.331748 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vm9k6" podUID="c0db5b5a-0bf0-459d-ba1b-46054d880831"
Apr 17 09:11:27.337177 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:27.337144 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2w27v"]
Apr 17 09:11:27.337297 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:27.337290 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2w27v"
Apr 17 09:11:27.337426 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:27.337405 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2w27v" podUID="ef0a40f6-04de-4672-9770-be487916c08b"
Apr 17 09:11:27.338063 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:27.338039 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7l5df"]
Apr 17 09:11:27.338150 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:27.338140 2606 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7l5df" Apr 17 09:11:27.338223 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:27.338208 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7l5df" podUID="252902a0-d0ed-496b-bcbb-6dc20ec3c9d4" Apr 17 09:11:28.020960 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:28.020926 2606 generic.go:358] "Generic (PLEG): container finished" podID="07361f70-d7d2-4866-8e18-3bd88e02229e" containerID="10c74a8ccc421bf65c6c44cea7e215aca8e7f6d69fde1a987c2a146267ae7328" exitCode=0 Apr 17 09:11:28.021384 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:28.021009 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2jnqg" event={"ID":"07361f70-d7d2-4866-8e18-3bd88e02229e","Type":"ContainerDied","Data":"10c74a8ccc421bf65c6c44cea7e215aca8e7f6d69fde1a987c2a146267ae7328"} Apr 17 09:11:28.865909 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:28.865868 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2w27v" Apr 17 09:11:28.866064 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:28.865868 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vm9k6" Apr 17 09:11:28.866064 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:28.865992 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2w27v" podUID="ef0a40f6-04de-4672-9770-be487916c08b" Apr 17 09:11:28.866064 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:28.865876 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7l5df" Apr 17 09:11:28.866064 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:28.866052 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vm9k6" podUID="c0db5b5a-0bf0-459d-ba1b-46054d880831" Apr 17 09:11:28.866222 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:28.866132 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7l5df" podUID="252902a0-d0ed-496b-bcbb-6dc20ec3c9d4" Apr 17 09:11:29.024346 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:29.024260 2606 generic.go:358] "Generic (PLEG): container finished" podID="07361f70-d7d2-4866-8e18-3bd88e02229e" containerID="155778d6bcb39be682b4c43570d5aa98269634bd91b372d5628f699877a6506c" exitCode=0 Apr 17 09:11:29.024346 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:29.024319 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2jnqg" event={"ID":"07361f70-d7d2-4866-8e18-3bd88e02229e","Type":"ContainerDied","Data":"155778d6bcb39be682b4c43570d5aa98269634bd91b372d5628f699877a6506c"} Apr 17 09:11:30.866314 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:30.866115 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2w27v" Apr 17 09:11:30.866799 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:30.866159 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7l5df" Apr 17 09:11:30.866799 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:30.866431 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2w27v" podUID="ef0a40f6-04de-4672-9770-be487916c08b" Apr 17 09:11:30.866799 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:30.866163 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-vm9k6" Apr 17 09:11:30.866799 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:30.866509 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7l5df" podUID="252902a0-d0ed-496b-bcbb-6dc20ec3c9d4" Apr 17 09:11:30.866799 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:30.866593 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vm9k6" podUID="c0db5b5a-0bf0-459d-ba1b-46054d880831" Apr 17 09:11:32.865572 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:32.865533 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2w27v" Apr 17 09:11:32.866176 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:32.865531 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vm9k6" Apr 17 09:11:32.866176 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:32.865670 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2w27v" podUID="ef0a40f6-04de-4672-9770-be487916c08b" Apr 17 09:11:32.866176 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:32.865531 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7l5df" Apr 17 09:11:32.866176 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:32.865751 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vm9k6" podUID="c0db5b5a-0bf0-459d-ba1b-46054d880831" Apr 17 09:11:32.866176 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:32.865810 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7l5df" podUID="252902a0-d0ed-496b-bcbb-6dc20ec3c9d4" Apr 17 09:11:33.077916 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.077845 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-18.ec2.internal" event="NodeReady" Apr 17 09:11:33.078100 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.077994 2606 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 09:11:33.111506 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.111454 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-m7rcj"] Apr 17 09:11:33.113612 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.113582 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7rcj" Apr 17 09:11:33.115171 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.115126 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl"] Apr 17 09:11:33.116413 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.116385 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 17 09:11:33.116591 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.116556 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 17 09:11:33.116658 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.116635 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-vngck\"" Apr 17 09:11:33.117102 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.117081 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58544c9c65-x6c6c"] Apr 17 09:11:33.117221 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.117206 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl" Apr 17 09:11:33.118724 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.118705 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-569565b78d-fkh9d"] Apr 17 09:11:33.118849 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.118832 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58544c9c65-x6c6c" Apr 17 09:11:33.119670 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.119649 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 09:11:33.119830 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.119804 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 09:11:33.119830 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.119822 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 17 09:11:33.120002 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.119867 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 17 09:11:33.120002 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.119820 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 17 09:11:33.120002 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.119919 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 17 09:11:33.120158 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.120046 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 09:11:33.120597 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.120578 2606 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-55c588f6cb-dbp8j"] Apr 17 09:11:33.120739 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.120718 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-569565b78d-fkh9d" Apr 17 09:11:33.121385 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.121328 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-8nwrs\"" Apr 17 09:11:33.121482 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.121434 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 17 09:11:33.122707 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.122671 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55c588f6cb-dbp8j" Apr 17 09:11:33.123735 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.123517 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 09:11:33.124286 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.123982 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-v9xfs\"" Apr 17 09:11:33.124286 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.124125 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 09:11:33.124286 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.124172 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 09:11:33.125796 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.125775 2606 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 17 09:11:33.128192 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.126897 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-m7rcj"] Apr 17 09:11:33.128521 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.128335 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6c4sw"] Apr 17 09:11:33.130844 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.130685 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58544c9c65-x6c6c"] Apr 17 09:11:33.130844 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.130712 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-55c588f6cb-dbp8j"] Apr 17 09:11:33.130844 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.130788 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 09:11:33.130844 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.130827 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-6c4sw" Apr 17 09:11:33.134348 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.134086 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl"] Apr 17 09:11:33.134348 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.134116 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-569565b78d-fkh9d"] Apr 17 09:11:33.134348 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.134342 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-hv2j9\"" Apr 17 09:11:33.135028 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.134685 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 09:11:33.135028 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.134885 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 09:11:33.146261 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.146197 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6c4sw"] Apr 17 09:11:33.206744 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.206707 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/11812f01-f01e-4a9f-a011-51e2ede8fbc2-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-75d58b577-r96wl\" (UID: \"11812f01-f01e-4a9f-a011-51e2ede8fbc2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl" Apr 17 09:11:33.206744 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.206745 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: 
\"kubernetes.io/secret/11812f01-f01e-4a9f-a011-51e2ede8fbc2-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-75d58b577-r96wl\" (UID: \"11812f01-f01e-4a9f-a011-51e2ede8fbc2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl" Apr 17 09:11:33.206969 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.206774 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-certificates\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d" Apr 17 09:11:33.206969 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.206800 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9223787d-c7eb-4652-8232-9e4d13e72e36-tmp\") pod \"klusterlet-addon-workmgr-55c588f6cb-dbp8j\" (UID: \"9223787d-c7eb-4652-8232-9e4d13e72e36\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55c588f6cb-dbp8j" Apr 17 09:11:33.206969 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.206835 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d" Apr 17 09:11:33.206969 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.206853 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb63c154-05d7-4e31-bcd5-6b2da69d604c-config-volume\") pod \"dns-default-6c4sw\" (UID: \"cb63c154-05d7-4e31-bcd5-6b2da69d604c\") 
" pod="openshift-dns/dns-default-6c4sw" Apr 17 09:11:33.206969 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.206877 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb63c154-05d7-4e31-bcd5-6b2da69d604c-metrics-tls\") pod \"dns-default-6c4sw\" (UID: \"cb63c154-05d7-4e31-bcd5-6b2da69d604c\") " pod="openshift-dns/dns-default-6c4sw" Apr 17 09:11:33.206969 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.206896 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cb63c154-05d7-4e31-bcd5-6b2da69d604c-tmp-dir\") pod \"dns-default-6c4sw\" (UID: \"cb63c154-05d7-4e31-bcd5-6b2da69d604c\") " pod="openshift-dns/dns-default-6c4sw" Apr 17 09:11:33.206969 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.206910 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/11812f01-f01e-4a9f-a011-51e2ede8fbc2-hub\") pod \"cluster-proxy-proxy-agent-75d58b577-r96wl\" (UID: \"11812f01-f01e-4a9f-a011-51e2ede8fbc2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl" Apr 17 09:11:33.206969 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.206924 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-image-registry-private-configuration\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d" Apr 17 09:11:33.207229 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.207036 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-m7rcj\" (UID: \"8381b0dc-c82c-4102-aa7e-97c8fec5ffc8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7rcj" Apr 17 09:11:33.207229 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.207069 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/11812f01-f01e-4a9f-a011-51e2ede8fbc2-ca\") pod \"cluster-proxy-proxy-agent-75d58b577-r96wl\" (UID: \"11812f01-f01e-4a9f-a011-51e2ede8fbc2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl" Apr 17 09:11:33.207229 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.207092 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlgb8\" (UniqueName: \"kubernetes.io/projected/2bce8a2f-a74a-4132-9eb3-8da275ed9ba3-kube-api-access-hlgb8\") pod \"managed-serviceaccount-addon-agent-58544c9c65-x6c6c\" (UID: \"2bce8a2f-a74a-4132-9eb3-8da275ed9ba3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58544c9c65-x6c6c" Apr 17 09:11:33.207229 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.207111 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-trusted-ca\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d" Apr 17 09:11:33.207229 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.207131 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-bound-sa-token\") pod \"image-registry-569565b78d-fkh9d\" (UID: 
\"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d" Apr 17 09:11:33.207229 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.207191 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmt22\" (UniqueName: \"kubernetes.io/projected/cb63c154-05d7-4e31-bcd5-6b2da69d604c-kube-api-access-jmt22\") pod \"dns-default-6c4sw\" (UID: \"cb63c154-05d7-4e31-bcd5-6b2da69d604c\") " pod="openshift-dns/dns-default-6c4sw" Apr 17 09:11:33.207229 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.207220 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-857df\" (UniqueName: \"kubernetes.io/projected/11812f01-f01e-4a9f-a011-51e2ede8fbc2-kube-api-access-857df\") pod \"cluster-proxy-proxy-agent-75d58b577-r96wl\" (UID: \"11812f01-f01e-4a9f-a011-51e2ede8fbc2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl" Apr 17 09:11:33.207550 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.207278 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-m7rcj\" (UID: \"8381b0dc-c82c-4102-aa7e-97c8fec5ffc8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7rcj" Apr 17 09:11:33.207550 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.207318 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/11812f01-f01e-4a9f-a011-51e2ede8fbc2-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-75d58b577-r96wl\" (UID: \"11812f01-f01e-4a9f-a011-51e2ede8fbc2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl" Apr 
Apr 17 09:11:33.207550 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.207348 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-ca-trust-extracted\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d"
Apr 17 09:11:33.207550 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.207371 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpjhm\" (UniqueName: \"kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-kube-api-access-hpjhm\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d"
Apr 17 09:11:33.207550 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.207403 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2bce8a2f-a74a-4132-9eb3-8da275ed9ba3-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-58544c9c65-x6c6c\" (UID: \"2bce8a2f-a74a-4132-9eb3-8da275ed9ba3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58544c9c65-x6c6c"
Apr 17 09:11:33.207550 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.207419 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-installation-pull-secrets\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d"
Apr 17 09:11:33.207550 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.207436 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/9223787d-c7eb-4652-8232-9e4d13e72e36-klusterlet-config\") pod \"klusterlet-addon-workmgr-55c588f6cb-dbp8j\" (UID: \"9223787d-c7eb-4652-8232-9e4d13e72e36\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55c588f6cb-dbp8j"
Apr 17 09:11:33.207550 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.207453 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhgb2\" (UniqueName: \"kubernetes.io/projected/9223787d-c7eb-4652-8232-9e4d13e72e36-kube-api-access-fhgb2\") pod \"klusterlet-addon-workmgr-55c588f6cb-dbp8j\" (UID: \"9223787d-c7eb-4652-8232-9e4d13e72e36\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55c588f6cb-dbp8j"
Apr 17 09:11:33.223123 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.223088 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-snccl"]
Apr 17 09:11:33.225165 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.225143 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-snccl"
Apr 17 09:11:33.228586 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.228552 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 09:11:33.228586 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.228553 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hdtpc\""
Apr 17 09:11:33.228786 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.228560 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 09:11:33.228786 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.228682 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 09:11:33.233650 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.233604 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-snccl"]
Apr 17 09:11:33.308067 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.308033 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/11812f01-f01e-4a9f-a011-51e2ede8fbc2-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-75d58b577-r96wl\" (UID: \"11812f01-f01e-4a9f-a011-51e2ede8fbc2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl"
Apr 17 09:11:33.308247 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.308106 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-bound-sa-token\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d"
Apr 17 09:11:33.308247 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.308135 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/9223787d-c7eb-4652-8232-9e4d13e72e36-klusterlet-config\") pod \"klusterlet-addon-workmgr-55c588f6cb-dbp8j\" (UID: \"9223787d-c7eb-4652-8232-9e4d13e72e36\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55c588f6cb-dbp8j"
Apr 17 09:11:33.308247 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.308164 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e6cf676-b554-4b97-99de-d1ff810ef911-cert\") pod \"ingress-canary-snccl\" (UID: \"2e6cf676-b554-4b97-99de-d1ff810ef911\") " pod="openshift-ingress-canary/ingress-canary-snccl"
Apr 17 09:11:33.308247 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.308190 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-m7rcj\" (UID: \"8381b0dc-c82c-4102-aa7e-97c8fec5ffc8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7rcj"
Apr 17 09:11:33.308247 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.308216 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-ca-trust-extracted\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d"
Apr 17 09:11:33.308561 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:33.308337 2606 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 09:11:33.308561 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.308349 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2bce8a2f-a74a-4132-9eb3-8da275ed9ba3-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-58544c9c65-x6c6c\" (UID: \"2bce8a2f-a74a-4132-9eb3-8da275ed9ba3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58544c9c65-x6c6c"
Apr 17 09:11:33.308561 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:33.308405 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-networking-console-plugin-cert podName:8381b0dc-c82c-4102-aa7e-97c8fec5ffc8 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:33.80838369 +0000 UTC m=+34.527881734 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-m7rcj" (UID: "8381b0dc-c82c-4102-aa7e-97c8fec5ffc8") : secret "networking-console-plugin-cert" not found
Apr 17 09:11:33.308721 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.308570 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhgb2\" (UniqueName: \"kubernetes.io/projected/9223787d-c7eb-4652-8232-9e4d13e72e36-kube-api-access-fhgb2\") pod \"klusterlet-addon-workmgr-55c588f6cb-dbp8j\" (UID: \"9223787d-c7eb-4652-8232-9e4d13e72e36\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55c588f6cb-dbp8j"
Apr 17 09:11:33.308721 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.308605 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/11812f01-f01e-4a9f-a011-51e2ede8fbc2-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-75d58b577-r96wl\" (UID: \"11812f01-f01e-4a9f-a011-51e2ede8fbc2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl"
Apr 17 09:11:33.308721 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.308631 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-certificates\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d"
Apr 17 09:11:33.308721 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.308653 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9223787d-c7eb-4652-8232-9e4d13e72e36-tmp\") pod \"klusterlet-addon-workmgr-55c588f6cb-dbp8j\" (UID: \"9223787d-c7eb-4652-8232-9e4d13e72e36\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55c588f6cb-dbp8j"
Apr 17 09:11:33.308721 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.308677 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d"
Apr 17 09:11:33.308721 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.308703 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gtxb\" (UniqueName: \"kubernetes.io/projected/2e6cf676-b554-4b97-99de-d1ff810ef911-kube-api-access-9gtxb\") pod \"ingress-canary-snccl\" (UID: \"2e6cf676-b554-4b97-99de-d1ff810ef911\") " pod="openshift-ingress-canary/ingress-canary-snccl"
Apr 17 09:11:33.308977 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.308736 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb63c154-05d7-4e31-bcd5-6b2da69d604c-config-volume\") pod \"dns-default-6c4sw\" (UID: \"cb63c154-05d7-4e31-bcd5-6b2da69d604c\") " pod="openshift-dns/dns-default-6c4sw"
Apr 17 09:11:33.308977 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.308761 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb63c154-05d7-4e31-bcd5-6b2da69d604c-metrics-tls\") pod \"dns-default-6c4sw\" (UID: \"cb63c154-05d7-4e31-bcd5-6b2da69d604c\") " pod="openshift-dns/dns-default-6c4sw"
Apr 17 09:11:33.308977 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.308787 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cb63c154-05d7-4e31-bcd5-6b2da69d604c-tmp-dir\") pod \"dns-default-6c4sw\" (UID: \"cb63c154-05d7-4e31-bcd5-6b2da69d604c\") " pod="openshift-dns/dns-default-6c4sw"
Apr 17 09:11:33.308977 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.308796 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-ca-trust-extracted\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d"
Apr 17 09:11:33.308977 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.308814 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/11812f01-f01e-4a9f-a011-51e2ede8fbc2-hub\") pod \"cluster-proxy-proxy-agent-75d58b577-r96wl\" (UID: \"11812f01-f01e-4a9f-a011-51e2ede8fbc2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl"
Apr 17 09:11:33.308977 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.308869 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-image-registry-private-configuration\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d"
Apr 17 09:11:33.308977 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:33.308896 2606 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 09:11:33.308977 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:33.308909 2606 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-569565b78d-fkh9d: secret "image-registry-tls" not found
Apr 17 09:11:33.308977 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.308930 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/11812f01-f01e-4a9f-a011-51e2ede8fbc2-ca\") pod \"cluster-proxy-proxy-agent-75d58b577-r96wl\" (UID: \"11812f01-f01e-4a9f-a011-51e2ede8fbc2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl"
Apr 17 09:11:33.308977 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:33.308951 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls podName:d997507b-9e3c-4a7f-bb90-5f4b719b52c2 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:33.808934616 +0000 UTC m=+34.528432656 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls") pod "image-registry-569565b78d-fkh9d" (UID: "d997507b-9e3c-4a7f-bb90-5f4b719b52c2") : secret "image-registry-tls" not found
Apr 17 09:11:33.308977 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.308980 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-installation-pull-secrets\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d"
Apr 17 09:11:33.309455 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.309015 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-m7rcj\" (UID: \"8381b0dc-c82c-4102-aa7e-97c8fec5ffc8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7rcj"
Apr 17 09:11:33.309455 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.309044 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlgb8\" (UniqueName: \"kubernetes.io/projected/2bce8a2f-a74a-4132-9eb3-8da275ed9ba3-kube-api-access-hlgb8\") pod \"managed-serviceaccount-addon-agent-58544c9c65-x6c6c\" (UID: \"2bce8a2f-a74a-4132-9eb3-8da275ed9ba3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58544c9c65-x6c6c"
Apr 17 09:11:33.309455 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.309068 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-trusted-ca\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d"
Apr 17 09:11:33.309455 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.309113 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmt22\" (UniqueName: \"kubernetes.io/projected/cb63c154-05d7-4e31-bcd5-6b2da69d604c-kube-api-access-jmt22\") pod \"dns-default-6c4sw\" (UID: \"cb63c154-05d7-4e31-bcd5-6b2da69d604c\") " pod="openshift-dns/dns-default-6c4sw"
Apr 17 09:11:33.309455 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.309141 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-857df\" (UniqueName: \"kubernetes.io/projected/11812f01-f01e-4a9f-a011-51e2ede8fbc2-kube-api-access-857df\") pod \"cluster-proxy-proxy-agent-75d58b577-r96wl\" (UID: \"11812f01-f01e-4a9f-a011-51e2ede8fbc2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl"
Apr 17 09:11:33.309455 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.309175 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/11812f01-f01e-4a9f-a011-51e2ede8fbc2-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-75d58b577-r96wl\" (UID: \"11812f01-f01e-4a9f-a011-51e2ede8fbc2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl"
Apr 17 09:11:33.309455 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.309206 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpjhm\" (UniqueName: \"kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-kube-api-access-hpjhm\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d"
Apr 17 09:11:33.310531 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.310427 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-trusted-ca\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d"
Apr 17 09:11:33.311870 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.311152 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-certificates\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d"
Apr 17 09:11:33.311870 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:33.311158 2606 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 09:11:33.311870 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:33.311252 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb63c154-05d7-4e31-bcd5-6b2da69d604c-metrics-tls podName:cb63c154-05d7-4e31-bcd5-6b2da69d604c nodeName:}" failed. No retries permitted until 2026-04-17 09:11:33.811230851 +0000 UTC m=+34.530728893 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cb63c154-05d7-4e31-bcd5-6b2da69d604c-metrics-tls") pod "dns-default-6c4sw" (UID: "cb63c154-05d7-4e31-bcd5-6b2da69d604c") : secret "dns-default-metrics-tls" not found
Apr 17 09:11:33.311870 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.311422 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/11812f01-f01e-4a9f-a011-51e2ede8fbc2-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-75d58b577-r96wl\" (UID: \"11812f01-f01e-4a9f-a011-51e2ede8fbc2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl"
Apr 17 09:11:33.311870 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.311572 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9223787d-c7eb-4652-8232-9e4d13e72e36-tmp\") pod \"klusterlet-addon-workmgr-55c588f6cb-dbp8j\" (UID: \"9223787d-c7eb-4652-8232-9e4d13e72e36\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55c588f6cb-dbp8j"
Apr 17 09:11:33.311870 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.311772 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb63c154-05d7-4e31-bcd5-6b2da69d604c-config-volume\") pod \"dns-default-6c4sw\" (UID: \"cb63c154-05d7-4e31-bcd5-6b2da69d604c\") " pod="openshift-dns/dns-default-6c4sw"
Apr 17 09:11:33.312291 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.311997 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cb63c154-05d7-4e31-bcd5-6b2da69d604c-tmp-dir\") pod \"dns-default-6c4sw\" (UID: \"cb63c154-05d7-4e31-bcd5-6b2da69d604c\") " pod="openshift-dns/dns-default-6c4sw"
Apr 17 09:11:33.312599 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.312563 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-m7rcj\" (UID: \"8381b0dc-c82c-4102-aa7e-97c8fec5ffc8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7rcj"
Apr 17 09:11:33.313845 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.313798 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/11812f01-f01e-4a9f-a011-51e2ede8fbc2-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-75d58b577-r96wl\" (UID: \"11812f01-f01e-4a9f-a011-51e2ede8fbc2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl"
Apr 17 09:11:33.314055 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.314030 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/11812f01-f01e-4a9f-a011-51e2ede8fbc2-hub\") pod \"cluster-proxy-proxy-agent-75d58b577-r96wl\" (UID: \"11812f01-f01e-4a9f-a011-51e2ede8fbc2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl"
Apr 17 09:11:33.314440 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.314402 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-image-registry-private-configuration\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d"
Apr 17 09:11:33.314543 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.314458 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/11812f01-f01e-4a9f-a011-51e2ede8fbc2-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-75d58b577-r96wl\" (UID: \"11812f01-f01e-4a9f-a011-51e2ede8fbc2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl"
Apr 17 09:11:33.314661 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.314641 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2bce8a2f-a74a-4132-9eb3-8da275ed9ba3-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-58544c9c65-x6c6c\" (UID: \"2bce8a2f-a74a-4132-9eb3-8da275ed9ba3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58544c9c65-x6c6c"
Apr 17 09:11:33.314747 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.314654 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/9223787d-c7eb-4652-8232-9e4d13e72e36-klusterlet-config\") pod \"klusterlet-addon-workmgr-55c588f6cb-dbp8j\" (UID: \"9223787d-c7eb-4652-8232-9e4d13e72e36\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55c588f6cb-dbp8j"
Apr 17 09:11:33.315344 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.315304 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-installation-pull-secrets\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d"
Apr 17 09:11:33.317147 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.316433 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/11812f01-f01e-4a9f-a011-51e2ede8fbc2-ca\") pod \"cluster-proxy-proxy-agent-75d58b577-r96wl\" (UID: \"11812f01-f01e-4a9f-a011-51e2ede8fbc2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl"
Apr 17 09:11:33.317684 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.317663 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-bound-sa-token\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d"
Apr 17 09:11:33.318369 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.318045 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhgb2\" (UniqueName: \"kubernetes.io/projected/9223787d-c7eb-4652-8232-9e4d13e72e36-kube-api-access-fhgb2\") pod \"klusterlet-addon-workmgr-55c588f6cb-dbp8j\" (UID: \"9223787d-c7eb-4652-8232-9e4d13e72e36\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55c588f6cb-dbp8j"
Apr 17 09:11:33.318825 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.318797 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlgb8\" (UniqueName: \"kubernetes.io/projected/2bce8a2f-a74a-4132-9eb3-8da275ed9ba3-kube-api-access-hlgb8\") pod \"managed-serviceaccount-addon-agent-58544c9c65-x6c6c\" (UID: \"2bce8a2f-a74a-4132-9eb3-8da275ed9ba3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58544c9c65-x6c6c"
Apr 17 09:11:33.319366 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.319346 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmt22\" (UniqueName: \"kubernetes.io/projected/cb63c154-05d7-4e31-bcd5-6b2da69d604c-kube-api-access-jmt22\") pod \"dns-default-6c4sw\" (UID: \"cb63c154-05d7-4e31-bcd5-6b2da69d604c\") " pod="openshift-dns/dns-default-6c4sw"
Apr 17 09:11:33.319715 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.319693 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpjhm\" (UniqueName: \"kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-kube-api-access-hpjhm\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d"
Apr 17 09:11:33.320683 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.320660 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-857df\" (UniqueName: \"kubernetes.io/projected/11812f01-f01e-4a9f-a011-51e2ede8fbc2-kube-api-access-857df\") pod \"cluster-proxy-proxy-agent-75d58b577-r96wl\" (UID: \"11812f01-f01e-4a9f-a011-51e2ede8fbc2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl"
Apr 17 09:11:33.410197 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.410160 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e6cf676-b554-4b97-99de-d1ff810ef911-cert\") pod \"ingress-canary-snccl\" (UID: \"2e6cf676-b554-4b97-99de-d1ff810ef911\") " pod="openshift-ingress-canary/ingress-canary-snccl"
Apr 17 09:11:33.410395 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:33.410331 2606 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 09:11:33.410395 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.410361 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9gtxb\" (UniqueName: \"kubernetes.io/projected/2e6cf676-b554-4b97-99de-d1ff810ef911-kube-api-access-9gtxb\") pod \"ingress-canary-snccl\" (UID: \"2e6cf676-b554-4b97-99de-d1ff810ef911\") " pod="openshift-ingress-canary/ingress-canary-snccl"
Apr 17 09:11:33.410531 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:33.410414 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e6cf676-b554-4b97-99de-d1ff810ef911-cert podName:2e6cf676-b554-4b97-99de-d1ff810ef911 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:33.910393357 +0000 UTC m=+34.629891393 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e6cf676-b554-4b97-99de-d1ff810ef911-cert") pod "ingress-canary-snccl" (UID: "2e6cf676-b554-4b97-99de-d1ff810ef911") : secret "canary-serving-cert" not found
Apr 17 09:11:33.419094 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.419062 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gtxb\" (UniqueName: \"kubernetes.io/projected/2e6cf676-b554-4b97-99de-d1ff810ef911-kube-api-access-9gtxb\") pod \"ingress-canary-snccl\" (UID: \"2e6cf676-b554-4b97-99de-d1ff810ef911\") " pod="openshift-ingress-canary/ingress-canary-snccl"
Apr 17 09:11:33.446135 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.446097 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl"
Apr 17 09:11:33.454923 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.454895 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58544c9c65-x6c6c"
Apr 17 09:11:33.476817 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.476782 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55c588f6cb-dbp8j"
Apr 17 09:11:33.512386 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.511635 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs\") pod \"network-metrics-daemon-2w27v\" (UID: \"ef0a40f6-04de-4672-9770-be487916c08b\") " pod="openshift-multus/network-metrics-daemon-2w27v"
Apr 17 09:11:33.512386 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:33.511887 2606 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 09:11:33.512386 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:33.511957 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs podName:ef0a40f6-04de-4672-9770-be487916c08b nodeName:}" failed. No retries permitted until 2026-04-17 09:12:05.511938999 +0000 UTC m=+66.231437057 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs") pod "network-metrics-daemon-2w27v" (UID: "ef0a40f6-04de-4672-9770-be487916c08b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 09:11:33.612330 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.612293 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xz9s4\" (UniqueName: \"kubernetes.io/projected/252902a0-d0ed-496b-bcbb-6dc20ec3c9d4-kube-api-access-xz9s4\") pod \"network-check-target-7l5df\" (UID: \"252902a0-d0ed-496b-bcbb-6dc20ec3c9d4\") " pod="openshift-network-diagnostics/network-check-target-7l5df"
Apr 17 09:11:33.612480 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:33.612465 2606 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 09:11:33.612578 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:33.612481 2606 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 09:11:33.612578 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:33.612505 2606 projected.go:194] Error preparing data for projected volume kube-api-access-xz9s4 for pod openshift-network-diagnostics/network-check-target-7l5df: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 09:11:33.612578 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:33.612563 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/252902a0-d0ed-496b-bcbb-6dc20ec3c9d4-kube-api-access-xz9s4 podName:252902a0-d0ed-496b-bcbb-6dc20ec3c9d4 nodeName:}" failed. No retries permitted until 2026-04-17 09:12:05.612547896 +0000 UTC m=+66.332045947 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-xz9s4" (UniqueName: "kubernetes.io/projected/252902a0-d0ed-496b-bcbb-6dc20ec3c9d4-kube-api-access-xz9s4") pod "network-check-target-7l5df" (UID: "252902a0-d0ed-496b-bcbb-6dc20ec3c9d4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 09:11:33.636694 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.636410 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58544c9c65-x6c6c"]
Apr 17 09:11:33.642525 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:11:33.642470 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bce8a2f_a74a_4132_9eb3_8da275ed9ba3.slice/crio-30dabb092c2c72f09ba15ced678df2023ca32a33f390bd7235c083efcfd2acca WatchSource:0}: Error finding container 30dabb092c2c72f09ba15ced678df2023ca32a33f390bd7235c083efcfd2acca: Status 404 returned error can't find the container with id 30dabb092c2c72f09ba15ced678df2023ca32a33f390bd7235c083efcfd2acca
Apr 17 09:11:33.644441 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.644402 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl"]
Apr 17 09:11:33.647822 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:11:33.647793 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11812f01_f01e_4a9f_a011_51e2ede8fbc2.slice/crio-2eb6cb72a947c14954630b2a6f1b362d3a164f341d59d13cdd40cb15a9d3b3ef WatchSource:0}: Error finding container 2eb6cb72a947c14954630b2a6f1b362d3a164f341d59d13cdd40cb15a9d3b3ef: Status 404 returned error can't find the container with id 2eb6cb72a947c14954630b2a6f1b362d3a164f341d59d13cdd40cb15a9d3b3ef
Apr 17 09:11:33.657331 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.657300 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-55c588f6cb-dbp8j"]
Apr 17 09:11:33.660578 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:11:33.660468 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9223787d_c7eb_4652_8232_9e4d13e72e36.slice/crio-eb266a60122f77af3f2325bb4669680b0a79ae340f94e35b90d489458d882c79 WatchSource:0}: Error finding container eb266a60122f77af3f2325bb4669680b0a79ae340f94e35b90d489458d882c79: Status 404 returned error can't find the container with id eb266a60122f77af3f2325bb4669680b0a79ae340f94e35b90d489458d882c79
Apr 17 09:11:33.814140 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.814096 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-m7rcj\" (UID: \"8381b0dc-c82c-4102-aa7e-97c8fec5ffc8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7rcj"
Apr 17 09:11:33.814331 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.814164 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d"
Apr 17 09:11:33.814331 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.814204 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb63c154-05d7-4e31-bcd5-6b2da69d604c-metrics-tls\") pod \"dns-default-6c4sw\" (UID: \"cb63c154-05d7-4e31-bcd5-6b2da69d604c\") " pod="openshift-dns/dns-default-6c4sw"
Apr 17 09:11:33.814331 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:33.814260 2606 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 09:11:33.814518 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:33.814335 2606 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 09:11:33.814518 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:33.814263 2606 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 09:11:33.814518 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:33.814368 2606 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-569565b78d-fkh9d: secret "image-registry-tls" not found
Apr 17 09:11:33.814518 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:33.814341 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-networking-console-plugin-cert podName:8381b0dc-c82c-4102-aa7e-97c8fec5ffc8 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:34.814318447 +0000 UTC m=+35.533816491 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-m7rcj" (UID: "8381b0dc-c82c-4102-aa7e-97c8fec5ffc8") : secret "networking-console-plugin-cert" not found Apr 17 09:11:33.814518 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:33.814394 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb63c154-05d7-4e31-bcd5-6b2da69d604c-metrics-tls podName:cb63c154-05d7-4e31-bcd5-6b2da69d604c nodeName:}" failed. No retries permitted until 2026-04-17 09:11:34.81438144 +0000 UTC m=+35.533879485 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cb63c154-05d7-4e31-bcd5-6b2da69d604c-metrics-tls") pod "dns-default-6c4sw" (UID: "cb63c154-05d7-4e31-bcd5-6b2da69d604c") : secret "dns-default-metrics-tls" not found Apr 17 09:11:33.814518 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:33.814423 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls podName:d997507b-9e3c-4a7f-bb90-5f4b719b52c2 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:34.814405407 +0000 UTC m=+35.533903459 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls") pod "image-registry-569565b78d-fkh9d" (UID: "d997507b-9e3c-4a7f-bb90-5f4b719b52c2") : secret "image-registry-tls" not found Apr 17 09:11:33.915259 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:33.915152 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e6cf676-b554-4b97-99de-d1ff810ef911-cert\") pod \"ingress-canary-snccl\" (UID: \"2e6cf676-b554-4b97-99de-d1ff810ef911\") " pod="openshift-ingress-canary/ingress-canary-snccl" Apr 17 09:11:33.916090 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:33.915306 2606 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 09:11:33.916090 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:33.915454 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e6cf676-b554-4b97-99de-d1ff810ef911-cert podName:2e6cf676-b554-4b97-99de-d1ff810ef911 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:34.915430736 +0000 UTC m=+35.634928776 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e6cf676-b554-4b97-99de-d1ff810ef911-cert") pod "ingress-canary-snccl" (UID: "2e6cf676-b554-4b97-99de-d1ff810ef911") : secret "canary-serving-cert" not found
Apr 17 09:11:34.035826 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:34.035779 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58544c9c65-x6c6c" event={"ID":"2bce8a2f-a74a-4132-9eb3-8da275ed9ba3","Type":"ContainerStarted","Data":"30dabb092c2c72f09ba15ced678df2023ca32a33f390bd7235c083efcfd2acca"}
Apr 17 09:11:34.036966 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:34.036934 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl" event={"ID":"11812f01-f01e-4a9f-a011-51e2ede8fbc2","Type":"ContainerStarted","Data":"2eb6cb72a947c14954630b2a6f1b362d3a164f341d59d13cdd40cb15a9d3b3ef"}
Apr 17 09:11:34.038081 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:34.038053 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55c588f6cb-dbp8j" event={"ID":"9223787d-c7eb-4652-8232-9e4d13e72e36","Type":"ContainerStarted","Data":"eb266a60122f77af3f2325bb4669680b0a79ae340f94e35b90d489458d882c79"}
Apr 17 09:11:34.823799 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:34.823746 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-m7rcj\" (UID: \"8381b0dc-c82c-4102-aa7e-97c8fec5ffc8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7rcj"
Apr 17 09:11:34.823997 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:34.823880 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d"
Apr 17 09:11:34.823997 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:34.823920 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb63c154-05d7-4e31-bcd5-6b2da69d604c-metrics-tls\") pod \"dns-default-6c4sw\" (UID: \"cb63c154-05d7-4e31-bcd5-6b2da69d604c\") " pod="openshift-dns/dns-default-6c4sw"
Apr 17 09:11:34.824116 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:34.823998 2606 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 09:11:34.824116 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:34.824079 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-networking-console-plugin-cert podName:8381b0dc-c82c-4102-aa7e-97c8fec5ffc8 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:36.824057069 +0000 UTC m=+37.543555109 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-m7rcj" (UID: "8381b0dc-c82c-4102-aa7e-97c8fec5ffc8") : secret "networking-console-plugin-cert" not found
Apr 17 09:11:34.824116 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:34.824092 2606 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 09:11:34.824270 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:34.824147 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb63c154-05d7-4e31-bcd5-6b2da69d604c-metrics-tls podName:cb63c154-05d7-4e31-bcd5-6b2da69d604c nodeName:}" failed. No retries permitted until 2026-04-17 09:11:36.824129303 +0000 UTC m=+37.543627345 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cb63c154-05d7-4e31-bcd5-6b2da69d604c-metrics-tls") pod "dns-default-6c4sw" (UID: "cb63c154-05d7-4e31-bcd5-6b2da69d604c") : secret "dns-default-metrics-tls" not found
Apr 17 09:11:34.824270 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:34.824225 2606 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 09:11:34.824270 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:34.824237 2606 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-569565b78d-fkh9d: secret "image-registry-tls" not found
Apr 17 09:11:34.824419 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:34.824271 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls podName:d997507b-9e3c-4a7f-bb90-5f4b719b52c2 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:36.824259957 +0000 UTC m=+37.543757995 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls") pod "image-registry-569565b78d-fkh9d" (UID: "d997507b-9e3c-4a7f-bb90-5f4b719b52c2") : secret "image-registry-tls" not found
Apr 17 09:11:34.866633 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:34.866584 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2w27v"
Apr 17 09:11:34.867116 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:34.867100 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vm9k6"
Apr 17 09:11:34.867565 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:34.867543 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7l5df"
Apr 17 09:11:34.873896 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:34.873865 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wbqzc\""
Apr 17 09:11:34.874025 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:34.873951 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-7jx4l\""
Apr 17 09:11:34.874180 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:34.874137 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 09:11:34.874367 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:34.874306 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 09:11:34.874508 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:34.874370 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 09:11:34.874975 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:34.874659 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 09:11:34.925010 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:34.924970 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e6cf676-b554-4b97-99de-d1ff810ef911-cert\") pod \"ingress-canary-snccl\" (UID: \"2e6cf676-b554-4b97-99de-d1ff810ef911\") " pod="openshift-ingress-canary/ingress-canary-snccl"
Apr 17 09:11:34.925452 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:34.925189 2606 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 09:11:34.925452 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:34.925269 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e6cf676-b554-4b97-99de-d1ff810ef911-cert podName:2e6cf676-b554-4b97-99de-d1ff810ef911 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:36.925254728 +0000 UTC m=+37.644752782 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e6cf676-b554-4b97-99de-d1ff810ef911-cert") pod "ingress-canary-snccl" (UID: "2e6cf676-b554-4b97-99de-d1ff810ef911") : secret "canary-serving-cert" not found
Apr 17 09:11:36.845047 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:36.844999 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-m7rcj\" (UID: \"8381b0dc-c82c-4102-aa7e-97c8fec5ffc8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7rcj"
Apr 17 09:11:36.845515 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:36.845073 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d"
Apr 17 09:11:36.845515 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:36.845113 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb63c154-05d7-4e31-bcd5-6b2da69d604c-metrics-tls\") pod \"dns-default-6c4sw\" (UID: \"cb63c154-05d7-4e31-bcd5-6b2da69d604c\") " pod="openshift-dns/dns-default-6c4sw"
Apr 17 09:11:36.845515 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:36.845139 2606 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 09:11:36.845515 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:36.845180 2606 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 09:11:36.845515 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:36.845196 2606 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-569565b78d-fkh9d: secret "image-registry-tls" not found
Apr 17 09:11:36.845515 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:36.845221 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-networking-console-plugin-cert podName:8381b0dc-c82c-4102-aa7e-97c8fec5ffc8 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:40.845199517 +0000 UTC m=+41.564697553 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-m7rcj" (UID: "8381b0dc-c82c-4102-aa7e-97c8fec5ffc8") : secret "networking-console-plugin-cert" not found
Apr 17 09:11:36.845515 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:36.845245 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls podName:d997507b-9e3c-4a7f-bb90-5f4b719b52c2 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:40.845235598 +0000 UTC m=+41.564733635 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls") pod "image-registry-569565b78d-fkh9d" (UID: "d997507b-9e3c-4a7f-bb90-5f4b719b52c2") : secret "image-registry-tls" not found
Apr 17 09:11:36.845515 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:36.845260 2606 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 09:11:36.845515 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:36.845331 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb63c154-05d7-4e31-bcd5-6b2da69d604c-metrics-tls podName:cb63c154-05d7-4e31-bcd5-6b2da69d604c nodeName:}" failed. No retries permitted until 2026-04-17 09:11:40.845311468 +0000 UTC m=+41.564809517 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cb63c154-05d7-4e31-bcd5-6b2da69d604c-metrics-tls") pod "dns-default-6c4sw" (UID: "cb63c154-05d7-4e31-bcd5-6b2da69d604c") : secret "dns-default-metrics-tls" not found
Apr 17 09:11:36.945998 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:36.945949 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e6cf676-b554-4b97-99de-d1ff810ef911-cert\") pod \"ingress-canary-snccl\" (UID: \"2e6cf676-b554-4b97-99de-d1ff810ef911\") " pod="openshift-ingress-canary/ingress-canary-snccl"
Apr 17 09:11:36.946222 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:36.946122 2606 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 09:11:36.946222 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:36.946199 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e6cf676-b554-4b97-99de-d1ff810ef911-cert podName:2e6cf676-b554-4b97-99de-d1ff810ef911 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:40.94617782 +0000 UTC m=+41.665675860 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e6cf676-b554-4b97-99de-d1ff810ef911-cert") pod "ingress-canary-snccl" (UID: "2e6cf676-b554-4b97-99de-d1ff810ef911") : secret "canary-serving-cert" not found
Apr 17 09:11:39.265034 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:39.264940 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c0db5b5a-0bf0-459d-ba1b-46054d880831-original-pull-secret\") pod \"global-pull-secret-syncer-vm9k6\" (UID: \"c0db5b5a-0bf0-459d-ba1b-46054d880831\") " pod="kube-system/global-pull-secret-syncer-vm9k6"
Apr 17 09:11:39.269011 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:39.268974 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c0db5b5a-0bf0-459d-ba1b-46054d880831-original-pull-secret\") pod \"global-pull-secret-syncer-vm9k6\" (UID: \"c0db5b5a-0bf0-459d-ba1b-46054d880831\") " pod="kube-system/global-pull-secret-syncer-vm9k6"
Apr 17 09:11:39.399432 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:39.399391 2606 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-vm9k6"
Apr 17 09:11:40.889141 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:40.889099 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-m7rcj\" (UID: \"8381b0dc-c82c-4102-aa7e-97c8fec5ffc8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7rcj"
Apr 17 09:11:40.889545 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:40.889152 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d"
Apr 17 09:11:40.889545 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:40.889176 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb63c154-05d7-4e31-bcd5-6b2da69d604c-metrics-tls\") pod \"dns-default-6c4sw\" (UID: \"cb63c154-05d7-4e31-bcd5-6b2da69d604c\") " pod="openshift-dns/dns-default-6c4sw"
Apr 17 09:11:40.889545 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:40.889228 2606 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 09:11:40.889545 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:40.889261 2606 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 09:11:40.889545 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:40.889279 2606 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-569565b78d-fkh9d: secret "image-registry-tls" not found
Apr 17 09:11:40.889545 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:40.889292 2606 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 09:11:40.889545 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:40.889293 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-networking-console-plugin-cert podName:8381b0dc-c82c-4102-aa7e-97c8fec5ffc8 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:48.889277372 +0000 UTC m=+49.608775407 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-m7rcj" (UID: "8381b0dc-c82c-4102-aa7e-97c8fec5ffc8") : secret "networking-console-plugin-cert" not found
Apr 17 09:11:40.889545 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:40.889337 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls podName:d997507b-9e3c-4a7f-bb90-5f4b719b52c2 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:48.88932251 +0000 UTC m=+49.608820545 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls") pod "image-registry-569565b78d-fkh9d" (UID: "d997507b-9e3c-4a7f-bb90-5f4b719b52c2") : secret "image-registry-tls" not found
Apr 17 09:11:40.889545 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:40.889348 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb63c154-05d7-4e31-bcd5-6b2da69d604c-metrics-tls podName:cb63c154-05d7-4e31-bcd5-6b2da69d604c nodeName:}" failed. No retries permitted until 2026-04-17 09:11:48.889342299 +0000 UTC m=+49.608840334 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cb63c154-05d7-4e31-bcd5-6b2da69d604c-metrics-tls") pod "dns-default-6c4sw" (UID: "cb63c154-05d7-4e31-bcd5-6b2da69d604c") : secret "dns-default-metrics-tls" not found
Apr 17 09:11:40.989605 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:40.989561 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e6cf676-b554-4b97-99de-d1ff810ef911-cert\") pod \"ingress-canary-snccl\" (UID: \"2e6cf676-b554-4b97-99de-d1ff810ef911\") " pod="openshift-ingress-canary/ingress-canary-snccl"
Apr 17 09:11:40.989768 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:40.989720 2606 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 09:11:40.989808 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:40.989795 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e6cf676-b554-4b97-99de-d1ff810ef911-cert podName:2e6cf676-b554-4b97-99de-d1ff810ef911 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:48.989779377 +0000 UTC m=+49.709277416 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e6cf676-b554-4b97-99de-d1ff810ef911-cert") pod "ingress-canary-snccl" (UID: "2e6cf676-b554-4b97-99de-d1ff810ef911") : secret "canary-serving-cert" not found
Apr 17 09:11:41.473718 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:41.473682 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vm9k6"]
Apr 17 09:11:41.495185 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:11:41.494891 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0db5b5a_0bf0_459d_ba1b_46054d880831.slice/crio-8ad77efa37671187d6845bcb39f3ffc9111cd670b48029669c4813a0ebaecb79 WatchSource:0}: Error finding container 8ad77efa37671187d6845bcb39f3ffc9111cd670b48029669c4813a0ebaecb79: Status 404 returned error can't find the container with id 8ad77efa37671187d6845bcb39f3ffc9111cd670b48029669c4813a0ebaecb79
Apr 17 09:11:42.056344 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:42.056251 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl" event={"ID":"11812f01-f01e-4a9f-a011-51e2ede8fbc2","Type":"ContainerStarted","Data":"d56ecf42cbba126ba67088bb9872e5ae07577dfd5b4d9a6b49a98992d5aeaa23"}
Apr 17 09:11:42.057285 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:42.057261 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vm9k6" event={"ID":"c0db5b5a-0bf0-459d-ba1b-46054d880831","Type":"ContainerStarted","Data":"8ad77efa37671187d6845bcb39f3ffc9111cd670b48029669c4813a0ebaecb79"}
Apr 17 09:11:42.058519 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:42.058479 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58544c9c65-x6c6c" event={"ID":"2bce8a2f-a74a-4132-9eb3-8da275ed9ba3","Type":"ContainerStarted","Data":"950a5565cf3c709d9105a40c4c052937e5c8c2bb08cdc0ab1f7c0537d5022ede"}
Apr 17 09:11:42.061045 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:42.061018 2606 generic.go:358] "Generic (PLEG): container finished" podID="07361f70-d7d2-4866-8e18-3bd88e02229e" containerID="81a252be94ffbe27b772524fc60c581ab2a24ce90853f54dfd6ceb5039cf1e58" exitCode=0
Apr 17 09:11:42.061158 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:42.061095 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2jnqg" event={"ID":"07361f70-d7d2-4866-8e18-3bd88e02229e","Type":"ContainerDied","Data":"81a252be94ffbe27b772524fc60c581ab2a24ce90853f54dfd6ceb5039cf1e58"}
Apr 17 09:11:42.062408 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:42.062388 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55c588f6cb-dbp8j" event={"ID":"9223787d-c7eb-4652-8232-9e4d13e72e36","Type":"ContainerStarted","Data":"a665dc68c7648aa53754f49a23a3effda37851d16b9559f3f1c3c6590efe992b"}
Apr 17 09:11:42.062768 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:42.062746 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55c588f6cb-dbp8j"
Apr 17 09:11:42.064584 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:42.064561 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55c588f6cb-dbp8j"
Apr 17 09:11:42.074028 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:42.073980 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58544c9c65-x6c6c" podStartSLOduration=22.370290492 podStartE2EDuration="30.073966538s" podCreationTimestamp="2026-04-17 09:11:12 +0000 UTC" firstStartedPulling="2026-04-17 09:11:33.644703751 +0000 UTC m=+34.364201786" lastFinishedPulling="2026-04-17 09:11:41.348379798 +0000 UTC m=+42.067877832" observedRunningTime="2026-04-17 09:11:42.073279438 +0000 UTC m=+42.792777498" watchObservedRunningTime="2026-04-17 09:11:42.073966538 +0000 UTC m=+42.793464586"
Apr 17 09:11:42.087935 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:42.087841 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55c588f6cb-dbp8j" podStartSLOduration=22.404359831 podStartE2EDuration="30.087824999s" podCreationTimestamp="2026-04-17 09:11:12 +0000 UTC" firstStartedPulling="2026-04-17 09:11:33.66241515 +0000 UTC m=+34.381913185" lastFinishedPulling="2026-04-17 09:11:41.345880302 +0000 UTC m=+42.065378353" observedRunningTime="2026-04-17 09:11:42.087644854 +0000 UTC m=+42.807142915" watchObservedRunningTime="2026-04-17 09:11:42.087824999 +0000 UTC m=+42.807323058"
Apr 17 09:11:43.068666 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:43.068622 2606 generic.go:358] "Generic (PLEG): container finished" podID="07361f70-d7d2-4866-8e18-3bd88e02229e" containerID="91eadeafd89a2fb3a1ec50fa6731360b9898a9c24570e4c6ee99d775292b422d" exitCode=0
Apr 17 09:11:43.069096 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:43.068704 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2jnqg" event={"ID":"07361f70-d7d2-4866-8e18-3bd88e02229e","Type":"ContainerDied","Data":"91eadeafd89a2fb3a1ec50fa6731360b9898a9c24570e4c6ee99d775292b422d"}
Apr 17 09:11:44.073277 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:44.073235 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2jnqg" event={"ID":"07361f70-d7d2-4866-8e18-3bd88e02229e","Type":"ContainerStarted","Data":"68b0da029820b275612357a894cf22d99f7a75a80de455bc00cb456965fcb851"}
Apr 17 09:11:44.111389 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:44.111335 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-2jnqg" podStartSLOduration=6.3913331190000005 podStartE2EDuration="45.111319976s" podCreationTimestamp="2026-04-17 09:10:59 +0000 UTC" firstStartedPulling="2026-04-17 09:11:02.623042428 +0000 UTC m=+3.342540477" lastFinishedPulling="2026-04-17 09:11:41.343029287 +0000 UTC m=+42.062527334" observedRunningTime="2026-04-17 09:11:44.109840955 +0000 UTC m=+44.829339012" watchObservedRunningTime="2026-04-17 09:11:44.111319976 +0000 UTC m=+44.830818034"
Apr 17 09:11:45.077874 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:45.077798 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl" event={"ID":"11812f01-f01e-4a9f-a011-51e2ede8fbc2","Type":"ContainerStarted","Data":"20f35152aa63b1e6a63bd528e960844e0de4f2b8411f2cd94c0277dfa926c28d"}
Apr 17 09:11:45.077874 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:45.077847 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl" event={"ID":"11812f01-f01e-4a9f-a011-51e2ede8fbc2","Type":"ContainerStarted","Data":"c6ad8d7d75dd4bae81f47526bf33de5c5416d5c6b529ef52b873d3b824e517e8"}
Apr 17 09:11:45.097538 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:45.097334 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl" podStartSLOduration=22.695040979 podStartE2EDuration="33.097317915s" podCreationTimestamp="2026-04-17 09:11:12 +0000 UTC" firstStartedPulling="2026-04-17 09:11:33.650090942 +0000 UTC m=+34.369588987" lastFinishedPulling="2026-04-17 09:11:44.052367873 +0000 UTC m=+44.771865923" observedRunningTime="2026-04-17 09:11:45.096948001 +0000 UTC m=+45.816446060" watchObservedRunningTime="2026-04-17 09:11:45.097317915 +0000 UTC m=+45.816816001"
Apr 17 09:11:47.084103 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:47.084023 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vm9k6" event={"ID":"c0db5b5a-0bf0-459d-ba1b-46054d880831","Type":"ContainerStarted","Data":"96a0eb56f83306e0db7b719a7746aeb3c45862d729058b72800ca5eb35df7f10"}
Apr 17 09:11:47.104550 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:47.104468 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-vm9k6" podStartSLOduration=35.256060517 podStartE2EDuration="40.104453244s" podCreationTimestamp="2026-04-17 09:11:07 +0000 UTC" firstStartedPulling="2026-04-17 09:11:41.496932216 +0000 UTC m=+42.216430257" lastFinishedPulling="2026-04-17 09:11:46.345324933 +0000 UTC m=+47.064822984" observedRunningTime="2026-04-17 09:11:47.103862656 +0000 UTC m=+47.823360715" watchObservedRunningTime="2026-04-17 09:11:47.104453244 +0000 UTC m=+47.823951369"
Apr 17 09:11:48.958043 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:48.957995 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-m7rcj\" (UID: \"8381b0dc-c82c-4102-aa7e-97c8fec5ffc8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7rcj"
Apr 17 09:11:48.958043 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:48.958050 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d"
Apr 17 09:11:48.958586 ip-10-0-143-18 kubenswrapper[2606]: E0417
09:11:48.958143 2606 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 09:11:48.958586 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:48.958145 2606 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 09:11:48.958586 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:48.958172 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb63c154-05d7-4e31-bcd5-6b2da69d604c-metrics-tls\") pod \"dns-default-6c4sw\" (UID: \"cb63c154-05d7-4e31-bcd5-6b2da69d604c\") " pod="openshift-dns/dns-default-6c4sw" Apr 17 09:11:48.958586 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:48.958215 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-networking-console-plugin-cert podName:8381b0dc-c82c-4102-aa7e-97c8fec5ffc8 nodeName:}" failed. No retries permitted until 2026-04-17 09:12:04.95819716 +0000 UTC m=+65.677695195 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-m7rcj" (UID: "8381b0dc-c82c-4102-aa7e-97c8fec5ffc8") : secret "networking-console-plugin-cert" not found Apr 17 09:11:48.958586 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:48.958158 2606 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-569565b78d-fkh9d: secret "image-registry-tls" not found Apr 17 09:11:48.958586 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:48.958250 2606 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 09:11:48.958586 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:48.958293 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb63c154-05d7-4e31-bcd5-6b2da69d604c-metrics-tls podName:cb63c154-05d7-4e31-bcd5-6b2da69d604c nodeName:}" failed. No retries permitted until 2026-04-17 09:12:04.958278243 +0000 UTC m=+65.677776280 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cb63c154-05d7-4e31-bcd5-6b2da69d604c-metrics-tls") pod "dns-default-6c4sw" (UID: "cb63c154-05d7-4e31-bcd5-6b2da69d604c") : secret "dns-default-metrics-tls" not found Apr 17 09:11:48.958586 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:48.958310 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls podName:d997507b-9e3c-4a7f-bb90-5f4b719b52c2 nodeName:}" failed. No retries permitted until 2026-04-17 09:12:04.958303691 +0000 UTC m=+65.677801726 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls") pod "image-registry-569565b78d-fkh9d" (UID: "d997507b-9e3c-4a7f-bb90-5f4b719b52c2") : secret "image-registry-tls" not found Apr 17 09:11:49.058775 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:49.058735 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e6cf676-b554-4b97-99de-d1ff810ef911-cert\") pod \"ingress-canary-snccl\" (UID: \"2e6cf676-b554-4b97-99de-d1ff810ef911\") " pod="openshift-ingress-canary/ingress-canary-snccl" Apr 17 09:11:49.058964 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:49.058883 2606 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 09:11:49.058964 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:11:49.058952 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e6cf676-b554-4b97-99de-d1ff810ef911-cert podName:2e6cf676-b554-4b97-99de-d1ff810ef911 nodeName:}" failed. No retries permitted until 2026-04-17 09:12:05.058937588 +0000 UTC m=+65.778435623 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e6cf676-b554-4b97-99de-d1ff810ef911-cert") pod "ingress-canary-snccl" (UID: "2e6cf676-b554-4b97-99de-d1ff810ef911") : secret "canary-serving-cert" not found Apr 17 09:11:58.032066 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:11:58.032031 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pczcn" Apr 17 09:12:04.989801 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:12:04.989741 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-m7rcj\" (UID: \"8381b0dc-c82c-4102-aa7e-97c8fec5ffc8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7rcj" Apr 17 09:12:04.989801 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:12:04.989808 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d" Apr 17 09:12:04.990242 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:12:04.989832 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb63c154-05d7-4e31-bcd5-6b2da69d604c-metrics-tls\") pod \"dns-default-6c4sw\" (UID: \"cb63c154-05d7-4e31-bcd5-6b2da69d604c\") " pod="openshift-dns/dns-default-6c4sw" Apr 17 09:12:04.990242 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:12:04.989905 2606 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 09:12:04.990242 ip-10-0-143-18 
kubenswrapper[2606]: E0417 09:12:04.989947 2606 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 09:12:04.990242 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:12:04.989959 2606 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 09:12:04.990242 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:12:04.989988 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-networking-console-plugin-cert podName:8381b0dc-c82c-4102-aa7e-97c8fec5ffc8 nodeName:}" failed. No retries permitted until 2026-04-17 09:12:36.989968823 +0000 UTC m=+97.709466859 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-m7rcj" (UID: "8381b0dc-c82c-4102-aa7e-97c8fec5ffc8") : secret "networking-console-plugin-cert" not found Apr 17 09:12:04.990242 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:12:04.990004 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb63c154-05d7-4e31-bcd5-6b2da69d604c-metrics-tls podName:cb63c154-05d7-4e31-bcd5-6b2da69d604c nodeName:}" failed. No retries permitted until 2026-04-17 09:12:36.98999782 +0000 UTC m=+97.709495854 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cb63c154-05d7-4e31-bcd5-6b2da69d604c-metrics-tls") pod "dns-default-6c4sw" (UID: "cb63c154-05d7-4e31-bcd5-6b2da69d604c") : secret "dns-default-metrics-tls" not found Apr 17 09:12:04.990242 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:12:04.989962 2606 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-569565b78d-fkh9d: secret "image-registry-tls" not found Apr 17 09:12:04.990242 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:12:04.990084 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls podName:d997507b-9e3c-4a7f-bb90-5f4b719b52c2 nodeName:}" failed. No retries permitted until 2026-04-17 09:12:36.990066205 +0000 UTC m=+97.709564253 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls") pod "image-registry-569565b78d-fkh9d" (UID: "d997507b-9e3c-4a7f-bb90-5f4b719b52c2") : secret "image-registry-tls" not found Apr 17 09:12:05.090531 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:12:05.090471 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e6cf676-b554-4b97-99de-d1ff810ef911-cert\") pod \"ingress-canary-snccl\" (UID: \"2e6cf676-b554-4b97-99de-d1ff810ef911\") " pod="openshift-ingress-canary/ingress-canary-snccl" Apr 17 09:12:05.090700 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:12:05.090625 2606 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 09:12:05.090700 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:12:05.090699 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e6cf676-b554-4b97-99de-d1ff810ef911-cert 
podName:2e6cf676-b554-4b97-99de-d1ff810ef911 nodeName:}" failed. No retries permitted until 2026-04-17 09:12:37.090678528 +0000 UTC m=+97.810176563 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e6cf676-b554-4b97-99de-d1ff810ef911-cert") pod "ingress-canary-snccl" (UID: "2e6cf676-b554-4b97-99de-d1ff810ef911") : secret "canary-serving-cert" not found Apr 17 09:12:05.594663 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:12:05.594620 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs\") pod \"network-metrics-daemon-2w27v\" (UID: \"ef0a40f6-04de-4672-9770-be487916c08b\") " pod="openshift-multus/network-metrics-daemon-2w27v" Apr 17 09:12:05.597528 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:12:05.597482 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 09:12:05.605874 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:12:05.605848 2606 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 09:12:05.605956 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:12:05.605915 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs podName:ef0a40f6-04de-4672-9770-be487916c08b nodeName:}" failed. No retries permitted until 2026-04-17 09:13:09.605898412 +0000 UTC m=+130.325396447 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs") pod "network-metrics-daemon-2w27v" (UID: "ef0a40f6-04de-4672-9770-be487916c08b") : secret "metrics-daemon-secret" not found Apr 17 09:12:05.695679 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:12:05.695636 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xz9s4\" (UniqueName: \"kubernetes.io/projected/252902a0-d0ed-496b-bcbb-6dc20ec3c9d4-kube-api-access-xz9s4\") pod \"network-check-target-7l5df\" (UID: \"252902a0-d0ed-496b-bcbb-6dc20ec3c9d4\") " pod="openshift-network-diagnostics/network-check-target-7l5df" Apr 17 09:12:05.698571 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:12:05.698548 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 09:12:05.709104 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:12:05.709081 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 09:12:05.719605 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:12:05.719584 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz9s4\" (UniqueName: \"kubernetes.io/projected/252902a0-d0ed-496b-bcbb-6dc20ec3c9d4-kube-api-access-xz9s4\") pod \"network-check-target-7l5df\" (UID: \"252902a0-d0ed-496b-bcbb-6dc20ec3c9d4\") " pod="openshift-network-diagnostics/network-check-target-7l5df" Apr 17 09:12:05.810807 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:12:05.810771 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-7jx4l\"" Apr 17 09:12:05.818038 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:12:05.818008 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7l5df" Apr 17 09:12:05.939163 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:12:05.939126 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7l5df"] Apr 17 09:12:05.942323 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:12:05.942292 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod252902a0_d0ed_496b_bcbb_6dc20ec3c9d4.slice/crio-83cab6c49d21ff776811d221e5fa1309bf4e8da35792f8ef0ba5137d89b2cee3 WatchSource:0}: Error finding container 83cab6c49d21ff776811d221e5fa1309bf4e8da35792f8ef0ba5137d89b2cee3: Status 404 returned error can't find the container with id 83cab6c49d21ff776811d221e5fa1309bf4e8da35792f8ef0ba5137d89b2cee3 Apr 17 09:12:06.130745 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:12:06.130654 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7l5df" event={"ID":"252902a0-d0ed-496b-bcbb-6dc20ec3c9d4","Type":"ContainerStarted","Data":"83cab6c49d21ff776811d221e5fa1309bf4e8da35792f8ef0ba5137d89b2cee3"} Apr 17 09:12:09.140332 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:12:09.140293 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7l5df" event={"ID":"252902a0-d0ed-496b-bcbb-6dc20ec3c9d4","Type":"ContainerStarted","Data":"1e4513ddb1ca1f0b5a1a8f8c5f6bdd0421ef76f7f4e9a4be67acdcefe3dbc440"} Apr 17 09:12:10.142688 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:12:10.142658 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-7l5df" Apr 17 09:12:10.161333 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:12:10.161276 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-7l5df" 
podStartSLOduration=68.098175662 podStartE2EDuration="1m11.161260114s" podCreationTimestamp="2026-04-17 09:10:59 +0000 UTC" firstStartedPulling="2026-04-17 09:12:05.944223784 +0000 UTC m=+66.663721825" lastFinishedPulling="2026-04-17 09:12:09.007308239 +0000 UTC m=+69.726806277" observedRunningTime="2026-04-17 09:12:10.160142137 +0000 UTC m=+70.879640218" watchObservedRunningTime="2026-04-17 09:12:10.161260114 +0000 UTC m=+70.880758170" Apr 17 09:12:37.043345 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:12:37.043292 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-m7rcj\" (UID: \"8381b0dc-c82c-4102-aa7e-97c8fec5ffc8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7rcj" Apr 17 09:12:37.043819 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:12:37.043440 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d" Apr 17 09:12:37.043819 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:12:37.043453 2606 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 09:12:37.043819 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:12:37.043476 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb63c154-05d7-4e31-bcd5-6b2da69d604c-metrics-tls\") pod \"dns-default-6c4sw\" (UID: \"cb63c154-05d7-4e31-bcd5-6b2da69d604c\") " pod="openshift-dns/dns-default-6c4sw" Apr 17 09:12:37.043819 ip-10-0-143-18 
kubenswrapper[2606]: E0417 09:12:37.043568 2606 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 09:12:37.043819 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:12:37.043587 2606 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-569565b78d-fkh9d: secret "image-registry-tls" not found Apr 17 09:12:37.043819 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:12:37.043573 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-networking-console-plugin-cert podName:8381b0dc-c82c-4102-aa7e-97c8fec5ffc8 nodeName:}" failed. No retries permitted until 2026-04-17 09:13:41.043551627 +0000 UTC m=+161.763049679 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-m7rcj" (UID: "8381b0dc-c82c-4102-aa7e-97c8fec5ffc8") : secret "networking-console-plugin-cert" not found Apr 17 09:12:37.043819 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:12:37.043620 2606 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 09:12:37.043819 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:12:37.043640 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls podName:d997507b-9e3c-4a7f-bb90-5f4b719b52c2 nodeName:}" failed. No retries permitted until 2026-04-17 09:13:41.043628384 +0000 UTC m=+161.763126423 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls") pod "image-registry-569565b78d-fkh9d" (UID: "d997507b-9e3c-4a7f-bb90-5f4b719b52c2") : secret "image-registry-tls" not found Apr 17 09:12:37.043819 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:12:37.043686 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb63c154-05d7-4e31-bcd5-6b2da69d604c-metrics-tls podName:cb63c154-05d7-4e31-bcd5-6b2da69d604c nodeName:}" failed. No retries permitted until 2026-04-17 09:13:41.043669852 +0000 UTC m=+161.763167907 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cb63c154-05d7-4e31-bcd5-6b2da69d604c-metrics-tls") pod "dns-default-6c4sw" (UID: "cb63c154-05d7-4e31-bcd5-6b2da69d604c") : secret "dns-default-metrics-tls" not found Apr 17 09:12:37.144206 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:12:37.144110 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e6cf676-b554-4b97-99de-d1ff810ef911-cert\") pod \"ingress-canary-snccl\" (UID: \"2e6cf676-b554-4b97-99de-d1ff810ef911\") " pod="openshift-ingress-canary/ingress-canary-snccl" Apr 17 09:12:37.147506 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:12:37.144607 2606 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 09:12:37.147506 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:12:37.144705 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e6cf676-b554-4b97-99de-d1ff810ef911-cert podName:2e6cf676-b554-4b97-99de-d1ff810ef911 nodeName:}" failed. No retries permitted until 2026-04-17 09:13:41.144682289 +0000 UTC m=+161.864180344 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e6cf676-b554-4b97-99de-d1ff810ef911-cert") pod "ingress-canary-snccl" (UID: "2e6cf676-b554-4b97-99de-d1ff810ef911") : secret "canary-serving-cert" not found Apr 17 09:12:41.147396 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:12:41.147361 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-7l5df" Apr 17 09:13:09.684863 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:09.684802 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs\") pod \"network-metrics-daemon-2w27v\" (UID: \"ef0a40f6-04de-4672-9770-be487916c08b\") " pod="openshift-multus/network-metrics-daemon-2w27v" Apr 17 09:13:09.685295 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:13:09.684958 2606 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 09:13:09.685295 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:13:09.685039 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs podName:ef0a40f6-04de-4672-9770-be487916c08b nodeName:}" failed. No retries permitted until 2026-04-17 09:15:11.685020754 +0000 UTC m=+252.404518793 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs") pod "network-metrics-daemon-2w27v" (UID: "ef0a40f6-04de-4672-9770-be487916c08b") : secret "metrics-daemon-secret" not found Apr 17 09:13:19.284206 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:19.284180 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wxmnk_0a1e21cd-b620-41c1-9c1d-4e6fba3925ef/dns-node-resolver/0.log" Apr 17 09:13:19.684048 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:19.684019 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-g4rrd_e4f475ef-a8f7-4778-86ea-42db013683b6/node-ca/0.log" Apr 17 09:13:36.129564 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:13:36.129518 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7rcj" podUID="8381b0dc-c82c-4102-aa7e-97c8fec5ffc8" Apr 17 09:13:36.168734 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:13:36.168688 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-569565b78d-fkh9d" podUID="d997507b-9e3c-4a7f-bb90-5f4b719b52c2" Apr 17 09:13:36.183023 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:13:36.182988 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-6c4sw" podUID="cb63c154-05d7-4e31-bcd5-6b2da69d604c" Apr 17 09:13:36.234660 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:13:36.234607 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted 
volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-snccl" podUID="2e6cf676-b554-4b97-99de-d1ff810ef911"
Apr 17 09:13:36.346157 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:36.346134 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-569565b78d-fkh9d"
Apr 17 09:13:36.346283 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:36.346157 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6c4sw"
Apr 17 09:13:36.346283 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:36.346134 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7rcj"
Apr 17 09:13:36.346283 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:36.346134 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-snccl"
Apr 17 09:13:37.889242 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:13:37.889192 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-2w27v" podUID="ef0a40f6-04de-4672-9770-be487916c08b"
Apr 17 09:13:41.135219 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:41.135177 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-m7rcj\" (UID: \"8381b0dc-c82c-4102-aa7e-97c8fec5ffc8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7rcj"
Apr 17 09:13:41.135620 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:41.135228 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d"
Apr 17 09:13:41.135620 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:41.135258 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb63c154-05d7-4e31-bcd5-6b2da69d604c-metrics-tls\") pod \"dns-default-6c4sw\" (UID: \"cb63c154-05d7-4e31-bcd5-6b2da69d604c\") " pod="openshift-dns/dns-default-6c4sw"
Apr 17 09:13:41.137643 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:41.137618 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb63c154-05d7-4e31-bcd5-6b2da69d604c-metrics-tls\") pod \"dns-default-6c4sw\" (UID: \"cb63c154-05d7-4e31-bcd5-6b2da69d604c\") " pod="openshift-dns/dns-default-6c4sw"
Apr 17 09:13:41.137767 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:41.137737 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8381b0dc-c82c-4102-aa7e-97c8fec5ffc8-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-m7rcj\" (UID: \"8381b0dc-c82c-4102-aa7e-97c8fec5ffc8\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7rcj"
Apr 17 09:13:41.137825 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:41.137796 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls\") pod \"image-registry-569565b78d-fkh9d\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " pod="openshift-image-registry/image-registry-569565b78d-fkh9d"
Apr 17 09:13:41.151099 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:41.151075 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-vngck\""
Apr 17 09:13:41.151099 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:41.151077 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-v9xfs\""
Apr 17 09:13:41.151272 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:41.151080 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-hv2j9\""
Apr 17 09:13:41.158237 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:41.158217 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-569565b78d-fkh9d"
Apr 17 09:13:41.158322 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:41.158238 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6c4sw"
Apr 17 09:13:41.158322 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:41.158272 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7rcj"
Apr 17 09:13:41.236236 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:41.235672 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e6cf676-b554-4b97-99de-d1ff810ef911-cert\") pod \"ingress-canary-snccl\" (UID: \"2e6cf676-b554-4b97-99de-d1ff810ef911\") " pod="openshift-ingress-canary/ingress-canary-snccl"
Apr 17 09:13:41.239920 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:41.238840 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e6cf676-b554-4b97-99de-d1ff810ef911-cert\") pod \"ingress-canary-snccl\" (UID: \"2e6cf676-b554-4b97-99de-d1ff810ef911\") " pod="openshift-ingress-canary/ingress-canary-snccl"
Apr 17 09:13:41.306904 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:41.306872 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-569565b78d-fkh9d"]
Apr 17 09:13:41.310050 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:13:41.310020 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd997507b_9e3c_4a7f_bb90_5f4b719b52c2.slice/crio-6372a92101b05788e663572bf73d5f94c70b74df5b159a39d5fca5e68c28714d WatchSource:0}: Error finding container 6372a92101b05788e663572bf73d5f94c70b74df5b159a39d5fca5e68c28714d: Status 404 returned error can't find the container with id 6372a92101b05788e663572bf73d5f94c70b74df5b159a39d5fca5e68c28714d
Apr 17 09:13:41.359629 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:41.359594 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-569565b78d-fkh9d" event={"ID":"d997507b-9e3c-4a7f-bb90-5f4b719b52c2","Type":"ContainerStarted","Data":"6372a92101b05788e663572bf73d5f94c70b74df5b159a39d5fca5e68c28714d"}
Apr 17 09:13:41.450258 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:41.450173 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hdtpc\""
Apr 17 09:13:41.457566 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:41.457544 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-snccl"
Apr 17 09:13:41.521796 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:41.521738 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-m7rcj"]
Apr 17 09:13:41.527304 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:41.527258 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6c4sw"]
Apr 17 09:13:41.527933 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:13:41.527885 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8381b0dc_c82c_4102_aa7e_97c8fec5ffc8.slice/crio-08fa7527541c16802dc81f48aae914af52e2dffc13e7d6afb513b676493cecf2 WatchSource:0}: Error finding container 08fa7527541c16802dc81f48aae914af52e2dffc13e7d6afb513b676493cecf2: Status 404 returned error can't find the container with id 08fa7527541c16802dc81f48aae914af52e2dffc13e7d6afb513b676493cecf2
Apr 17 09:13:41.531561 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:13:41.531484 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb63c154_05d7_4e31_bcd5_6b2da69d604c.slice/crio-b1acdf431691b014c5538776f300ab8bde991ce6ae08fbd79fd3da4a77cd3e48 WatchSource:0}: Error finding container b1acdf431691b014c5538776f300ab8bde991ce6ae08fbd79fd3da4a77cd3e48: Status 404 returned error can't find the container with id b1acdf431691b014c5538776f300ab8bde991ce6ae08fbd79fd3da4a77cd3e48
Apr 17 09:13:41.585742 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:41.585705 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-snccl"]
Apr 17 09:13:41.595405 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:13:41.595371 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e6cf676_b554_4b97_99de_d1ff810ef911.slice/crio-cc9ab0b56679aad749392eed7f7f93e500f8bbaf5f47829e7595a29807acf2c9 WatchSource:0}: Error finding container cc9ab0b56679aad749392eed7f7f93e500f8bbaf5f47829e7595a29807acf2c9: Status 404 returned error can't find the container with id cc9ab0b56679aad749392eed7f7f93e500f8bbaf5f47829e7595a29807acf2c9
Apr 17 09:13:42.063670 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:42.063600 2606 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55c588f6cb-dbp8j" podUID="9223787d-c7eb-4652-8232-9e4d13e72e36" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.10:8000/readyz\": dial tcp 10.132.0.10:8000: connect: connection refused"
Apr 17 09:13:42.364782 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:42.364743 2606 generic.go:358] "Generic (PLEG): container finished" podID="2bce8a2f-a74a-4132-9eb3-8da275ed9ba3" containerID="950a5565cf3c709d9105a40c4c052937e5c8c2bb08cdc0ab1f7c0537d5022ede" exitCode=255
Apr 17 09:13:42.365226 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:42.364827 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58544c9c65-x6c6c" event={"ID":"2bce8a2f-a74a-4132-9eb3-8da275ed9ba3","Type":"ContainerDied","Data":"950a5565cf3c709d9105a40c4c052937e5c8c2bb08cdc0ab1f7c0537d5022ede"}
Apr 17 09:13:42.365271 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:42.365236 2606 scope.go:117] "RemoveContainer" containerID="950a5565cf3c709d9105a40c4c052937e5c8c2bb08cdc0ab1f7c0537d5022ede"
Apr 17 09:13:42.366769 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:42.366734 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-569565b78d-fkh9d" event={"ID":"d997507b-9e3c-4a7f-bb90-5f4b719b52c2","Type":"ContainerStarted","Data":"d1b31d200a2ae8933499ac030da6492be7884d114dae3f817dfcc3e44706f94e"}
Apr 17 09:13:42.366976 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:42.366960 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-569565b78d-fkh9d"
Apr 17 09:13:42.368356 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:42.368308 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-snccl" event={"ID":"2e6cf676-b554-4b97-99de-d1ff810ef911","Type":"ContainerStarted","Data":"cc9ab0b56679aad749392eed7f7f93e500f8bbaf5f47829e7595a29807acf2c9"}
Apr 17 09:13:42.369743 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:42.369715 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7rcj" event={"ID":"8381b0dc-c82c-4102-aa7e-97c8fec5ffc8","Type":"ContainerStarted","Data":"08fa7527541c16802dc81f48aae914af52e2dffc13e7d6afb513b676493cecf2"}
Apr 17 09:13:42.372175 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:42.372153 2606 generic.go:358] "Generic (PLEG): container finished" podID="9223787d-c7eb-4652-8232-9e4d13e72e36" containerID="a665dc68c7648aa53754f49a23a3effda37851d16b9559f3f1c3c6590efe992b" exitCode=1
Apr 17 09:13:42.372290 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:42.372221 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55c588f6cb-dbp8j" event={"ID":"9223787d-c7eb-4652-8232-9e4d13e72e36","Type":"ContainerDied","Data":"a665dc68c7648aa53754f49a23a3effda37851d16b9559f3f1c3c6590efe992b"}
Apr 17 09:13:42.372595 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:42.372575 2606 scope.go:117] "RemoveContainer" containerID="a665dc68c7648aa53754f49a23a3effda37851d16b9559f3f1c3c6590efe992b"
Apr 17 09:13:42.373395 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:42.373366 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6c4sw" event={"ID":"cb63c154-05d7-4e31-bcd5-6b2da69d604c","Type":"ContainerStarted","Data":"b1acdf431691b014c5538776f300ab8bde991ce6ae08fbd79fd3da4a77cd3e48"}
Apr 17 09:13:42.400090 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:42.400036 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-569565b78d-fkh9d" podStartSLOduration=162.400016971 podStartE2EDuration="2m42.400016971s" podCreationTimestamp="2026-04-17 09:11:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 09:13:42.398520392 +0000 UTC m=+163.118018484" watchObservedRunningTime="2026-04-17 09:13:42.400016971 +0000 UTC m=+163.119515029"
Apr 17 09:13:43.377345 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:43.377303 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7rcj" event={"ID":"8381b0dc-c82c-4102-aa7e-97c8fec5ffc8","Type":"ContainerStarted","Data":"6377827358b84576f01290ab380f8f4e755d8291b835014065c89ef025b08d3a"}
Apr 17 09:13:43.378976 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:43.378945 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55c588f6cb-dbp8j" event={"ID":"9223787d-c7eb-4652-8232-9e4d13e72e36","Type":"ContainerStarted","Data":"1c436b6a39f223d2b30098b8dc14f87e7da8327bd74dee2934ca6c1478a5f0c3"}
Apr 17 09:13:43.379276 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:43.379239 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55c588f6cb-dbp8j"
Apr 17 09:13:43.379884 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:43.379853 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55c588f6cb-dbp8j"
Apr 17 09:13:43.380769 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:43.380749 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58544c9c65-x6c6c" event={"ID":"2bce8a2f-a74a-4132-9eb3-8da275ed9ba3","Type":"ContainerStarted","Data":"5eaf326e78bf43782194f196de3660bfe2faa1963819f011d0b00ca66784a6a1"}
Apr 17 09:13:43.395138 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:43.395090 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-m7rcj" podStartSLOduration=162.109053502 podStartE2EDuration="2m43.395073684s" podCreationTimestamp="2026-04-17 09:11:00 +0000 UTC" firstStartedPulling="2026-04-17 09:13:41.530256237 +0000 UTC m=+162.249754272" lastFinishedPulling="2026-04-17 09:13:42.816276412 +0000 UTC m=+163.535774454" observedRunningTime="2026-04-17 09:13:43.39457885 +0000 UTC m=+164.114076907" watchObservedRunningTime="2026-04-17 09:13:43.395073684 +0000 UTC m=+164.114571743"
Apr 17 09:13:44.384408 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.384369 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-snccl" event={"ID":"2e6cf676-b554-4b97-99de-d1ff810ef911","Type":"ContainerStarted","Data":"e5639582205983c92f65451f1ec465fd636aeeff6b81a2758abeb01bbde9d41d"}
Apr 17 09:13:44.386001 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.385968 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6c4sw" event={"ID":"cb63c154-05d7-4e31-bcd5-6b2da69d604c","Type":"ContainerStarted","Data":"ffaba78c9a25985735856cf7fdd075fc792d69674559dde8dda264dcc326d9ab"}
Apr 17 09:13:44.386001 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.386003 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6c4sw" event={"ID":"cb63c154-05d7-4e31-bcd5-6b2da69d604c","Type":"ContainerStarted","Data":"7306bd3c7be05b161bc3383bbffae9dab424e469e9a2b34fc6b503ca235bb0cc"}
Apr 17 09:13:44.386175 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.386053 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-6c4sw"
Apr 17 09:13:44.400543 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.400473 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-snccl" podStartSLOduration=129.320198118 podStartE2EDuration="2m11.400459902s" podCreationTimestamp="2026-04-17 09:11:33 +0000 UTC" firstStartedPulling="2026-04-17 09:13:41.597216083 +0000 UTC m=+162.316714118" lastFinishedPulling="2026-04-17 09:13:43.677477864 +0000 UTC m=+164.396975902" observedRunningTime="2026-04-17 09:13:44.400445515 +0000 UTC m=+165.119943598" watchObservedRunningTime="2026-04-17 09:13:44.400459902 +0000 UTC m=+165.119957950"
Apr 17 09:13:44.419335 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.419285 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6c4sw" podStartSLOduration=129.275819436 podStartE2EDuration="2m11.419271512s" podCreationTimestamp="2026-04-17 09:11:33 +0000 UTC" firstStartedPulling="2026-04-17 09:13:41.533859387 +0000 UTC m=+162.253357421" lastFinishedPulling="2026-04-17 09:13:43.677311459 +0000 UTC m=+164.396809497" observedRunningTime="2026-04-17 09:13:44.418604623 +0000 UTC m=+165.138102678" watchObservedRunningTime="2026-04-17 09:13:44.419271512 +0000 UTC m=+165.138769567"
Apr 17 09:13:44.664052 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.663959 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-88rzd"]
Apr 17 09:13:44.666168 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.666142 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-88rzd"
Apr 17 09:13:44.668920 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.668887 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 09:13:44.669069 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.668887 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 09:13:44.669069 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.668888 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 09:13:44.669903 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.669884 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 09:13:44.670015 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.669886 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-f5jjb\""
Apr 17 09:13:44.683456 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.683430 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-88rzd"]
Apr 17 09:13:44.764054 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.764013 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4ead68b2-4f68-4605-9c1e-c4f73c735c33-crio-socket\") pod \"insights-runtime-extractor-88rzd\" (UID: \"4ead68b2-4f68-4605-9c1e-c4f73c735c33\") " pod="openshift-insights/insights-runtime-extractor-88rzd"
Apr 17 09:13:44.764237 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.764083 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4ead68b2-4f68-4605-9c1e-c4f73c735c33-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-88rzd\" (UID: \"4ead68b2-4f68-4605-9c1e-c4f73c735c33\") " pod="openshift-insights/insights-runtime-extractor-88rzd"
Apr 17 09:13:44.764237 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.764118 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd9hm\" (UniqueName: \"kubernetes.io/projected/4ead68b2-4f68-4605-9c1e-c4f73c735c33-kube-api-access-zd9hm\") pod \"insights-runtime-extractor-88rzd\" (UID: \"4ead68b2-4f68-4605-9c1e-c4f73c735c33\") " pod="openshift-insights/insights-runtime-extractor-88rzd"
Apr 17 09:13:44.764237 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.764199 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4ead68b2-4f68-4605-9c1e-c4f73c735c33-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-88rzd\" (UID: \"4ead68b2-4f68-4605-9c1e-c4f73c735c33\") " pod="openshift-insights/insights-runtime-extractor-88rzd"
Apr 17 09:13:44.764237 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.764231 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4ead68b2-4f68-4605-9c1e-c4f73c735c33-data-volume\") pod \"insights-runtime-extractor-88rzd\" (UID: \"4ead68b2-4f68-4605-9c1e-c4f73c735c33\") " pod="openshift-insights/insights-runtime-extractor-88rzd"
Apr 17 09:13:44.865390 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.865356 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4ead68b2-4f68-4605-9c1e-c4f73c735c33-crio-socket\") pod \"insights-runtime-extractor-88rzd\" (UID: \"4ead68b2-4f68-4605-9c1e-c4f73c735c33\") " pod="openshift-insights/insights-runtime-extractor-88rzd"
Apr 17 09:13:44.865390 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.865404 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4ead68b2-4f68-4605-9c1e-c4f73c735c33-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-88rzd\" (UID: \"4ead68b2-4f68-4605-9c1e-c4f73c735c33\") " pod="openshift-insights/insights-runtime-extractor-88rzd"
Apr 17 09:13:44.865641 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.865428 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zd9hm\" (UniqueName: \"kubernetes.io/projected/4ead68b2-4f68-4605-9c1e-c4f73c735c33-kube-api-access-zd9hm\") pod \"insights-runtime-extractor-88rzd\" (UID: \"4ead68b2-4f68-4605-9c1e-c4f73c735c33\") " pod="openshift-insights/insights-runtime-extractor-88rzd"
Apr 17 09:13:44.865641 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.865474 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4ead68b2-4f68-4605-9c1e-c4f73c735c33-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-88rzd\" (UID: \"4ead68b2-4f68-4605-9c1e-c4f73c735c33\") " pod="openshift-insights/insights-runtime-extractor-88rzd"
Apr 17 09:13:44.865641 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.865522 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4ead68b2-4f68-4605-9c1e-c4f73c735c33-data-volume\") pod \"insights-runtime-extractor-88rzd\" (UID: \"4ead68b2-4f68-4605-9c1e-c4f73c735c33\") " pod="openshift-insights/insights-runtime-extractor-88rzd"
Apr 17 09:13:44.865641 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.865525 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4ead68b2-4f68-4605-9c1e-c4f73c735c33-crio-socket\") pod \"insights-runtime-extractor-88rzd\" (UID: \"4ead68b2-4f68-4605-9c1e-c4f73c735c33\") " pod="openshift-insights/insights-runtime-extractor-88rzd"
Apr 17 09:13:44.865906 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.865885 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4ead68b2-4f68-4605-9c1e-c4f73c735c33-data-volume\") pod \"insights-runtime-extractor-88rzd\" (UID: \"4ead68b2-4f68-4605-9c1e-c4f73c735c33\") " pod="openshift-insights/insights-runtime-extractor-88rzd"
Apr 17 09:13:44.865977 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.865960 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4ead68b2-4f68-4605-9c1e-c4f73c735c33-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-88rzd\" (UID: \"4ead68b2-4f68-4605-9c1e-c4f73c735c33\") " pod="openshift-insights/insights-runtime-extractor-88rzd"
Apr 17 09:13:44.867864 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.867848 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4ead68b2-4f68-4605-9c1e-c4f73c735c33-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-88rzd\" (UID: \"4ead68b2-4f68-4605-9c1e-c4f73c735c33\") " pod="openshift-insights/insights-runtime-extractor-88rzd"
Apr 17 09:13:44.874224 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.874200 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd9hm\" (UniqueName: \"kubernetes.io/projected/4ead68b2-4f68-4605-9c1e-c4f73c735c33-kube-api-access-zd9hm\") pod \"insights-runtime-extractor-88rzd\" (UID: \"4ead68b2-4f68-4605-9c1e-c4f73c735c33\") " pod="openshift-insights/insights-runtime-extractor-88rzd"
Apr 17 09:13:44.975662 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:44.975578 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-88rzd"
Apr 17 09:13:45.093878 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:45.093837 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-88rzd"]
Apr 17 09:13:45.097135 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:13:45.097082 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ead68b2_4f68_4605_9c1e_c4f73c735c33.slice/crio-95bb53a519467fc2ba570360e72874313090c25836c05129b6976a3eb6166bfa WatchSource:0}: Error finding container 95bb53a519467fc2ba570360e72874313090c25836c05129b6976a3eb6166bfa: Status 404 returned error can't find the container with id 95bb53a519467fc2ba570360e72874313090c25836c05129b6976a3eb6166bfa
Apr 17 09:13:45.389970 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:45.389929 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-88rzd" event={"ID":"4ead68b2-4f68-4605-9c1e-c4f73c735c33","Type":"ContainerStarted","Data":"99b9569636e27b1396b40a654d740e828adc28028b734d158495f594b23221c6"}
Apr 17 09:13:45.389970 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:45.389976 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-88rzd" event={"ID":"4ead68b2-4f68-4605-9c1e-c4f73c735c33","Type":"ContainerStarted","Data":"95bb53a519467fc2ba570360e72874313090c25836c05129b6976a3eb6166bfa"}
Apr 17 09:13:46.395945 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:46.395891 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-88rzd" event={"ID":"4ead68b2-4f68-4605-9c1e-c4f73c735c33","Type":"ContainerStarted","Data":"9abc1abfbe6c80ce619419e47584551e3f2f03628d7f956ff0afabba96bde3d2"}
Apr 17 09:13:47.400107 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:47.400073 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-88rzd" event={"ID":"4ead68b2-4f68-4605-9c1e-c4f73c735c33","Type":"ContainerStarted","Data":"c45eb9d881f774adf7f88b0b9989fee2f197da8eb8820015d925c3ff07e32d53"}
Apr 17 09:13:47.418950 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:47.418897 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-88rzd" podStartSLOduration=1.25681795 podStartE2EDuration="3.418882496s" podCreationTimestamp="2026-04-17 09:13:44 +0000 UTC" firstStartedPulling="2026-04-17 09:13:45.160852626 +0000 UTC m=+165.880350661" lastFinishedPulling="2026-04-17 09:13:47.322917173 +0000 UTC m=+168.042415207" observedRunningTime="2026-04-17 09:13:47.418783888 +0000 UTC m=+168.138281936" watchObservedRunningTime="2026-04-17 09:13:47.418882496 +0000 UTC m=+168.138380551"
Apr 17 09:13:50.866032 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:50.865934 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2w27v"
Apr 17 09:13:54.392024 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:54.391992 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6c4sw"
Apr 17 09:13:56.277863 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.277826 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-h5dt8"]
Apr 17 09:13:56.281153 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.281113 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-h5dt8"
Apr 17 09:13:56.283899 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.283861 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 09:13:56.284051 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.283968 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-xkdtg\""
Apr 17 09:13:56.284225 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.284212 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 09:13:56.284225 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.284221 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 09:13:56.284998 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.284979 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 09:13:56.285113 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.285098 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 09:13:56.285217 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.285203 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 09:13:56.334401 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.334369 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/16c665c3-2369-4f8c-ad19-990742a98173-node-exporter-wtmp\") pod \"node-exporter-h5dt8\" (UID: \"16c665c3-2369-4f8c-ad19-990742a98173\") " pod="openshift-monitoring/node-exporter-h5dt8"
Apr 17 09:13:56.334401 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.334401 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/16c665c3-2369-4f8c-ad19-990742a98173-node-exporter-accelerators-collector-config\") pod \"node-exporter-h5dt8\" (UID: \"16c665c3-2369-4f8c-ad19-990742a98173\") " pod="openshift-monitoring/node-exporter-h5dt8"
Apr 17 09:13:56.334653 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.334421 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/16c665c3-2369-4f8c-ad19-990742a98173-metrics-client-ca\") pod \"node-exporter-h5dt8\" (UID: \"16c665c3-2369-4f8c-ad19-990742a98173\") " pod="openshift-monitoring/node-exporter-h5dt8"
Apr 17 09:13:56.334653 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.334439 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2nmt\" (UniqueName: \"kubernetes.io/projected/16c665c3-2369-4f8c-ad19-990742a98173-kube-api-access-h2nmt\") pod \"node-exporter-h5dt8\" (UID: \"16c665c3-2369-4f8c-ad19-990742a98173\") " pod="openshift-monitoring/node-exporter-h5dt8"
Apr 17 09:13:56.334653 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.334522 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/16c665c3-2369-4f8c-ad19-990742a98173-root\") pod \"node-exporter-h5dt8\" (UID: \"16c665c3-2369-4f8c-ad19-990742a98173\") " pod="openshift-monitoring/node-exporter-h5dt8"
Apr 17 09:13:56.334653 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.334570 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/16c665c3-2369-4f8c-ad19-990742a98173-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h5dt8\" (UID: \"16c665c3-2369-4f8c-ad19-990742a98173\") " pod="openshift-monitoring/node-exporter-h5dt8"
Apr 17 09:13:56.334653 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.334630 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/16c665c3-2369-4f8c-ad19-990742a98173-node-exporter-tls\") pod \"node-exporter-h5dt8\" (UID: \"16c665c3-2369-4f8c-ad19-990742a98173\") " pod="openshift-monitoring/node-exporter-h5dt8"
Apr 17 09:13:56.334809 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.334680 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/16c665c3-2369-4f8c-ad19-990742a98173-node-exporter-textfile\") pod \"node-exporter-h5dt8\" (UID: \"16c665c3-2369-4f8c-ad19-990742a98173\") " pod="openshift-monitoring/node-exporter-h5dt8"
Apr 17 09:13:56.334809 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.334729 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/16c665c3-2369-4f8c-ad19-990742a98173-sys\") pod \"node-exporter-h5dt8\" (UID: \"16c665c3-2369-4f8c-ad19-990742a98173\") " pod="openshift-monitoring/node-exporter-h5dt8"
Apr 17 09:13:56.435176 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.435146 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/16c665c3-2369-4f8c-ad19-990742a98173-root\") pod \"node-exporter-h5dt8\" (UID: \"16c665c3-2369-4f8c-ad19-990742a98173\") " pod="openshift-monitoring/node-exporter-h5dt8"
Apr 17 09:13:56.435352 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.435200 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/16c665c3-2369-4f8c-ad19-990742a98173-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h5dt8\" (UID: \"16c665c3-2369-4f8c-ad19-990742a98173\") " pod="openshift-monitoring/node-exporter-h5dt8"
Apr 17 09:13:56.435352 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.435227 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/16c665c3-2369-4f8c-ad19-990742a98173-node-exporter-tls\") pod \"node-exporter-h5dt8\" (UID: \"16c665c3-2369-4f8c-ad19-990742a98173\") " pod="openshift-monitoring/node-exporter-h5dt8"
Apr 17 09:13:56.435352 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.435251 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/16c665c3-2369-4f8c-ad19-990742a98173-node-exporter-textfile\") pod \"node-exporter-h5dt8\" (UID: \"16c665c3-2369-4f8c-ad19-990742a98173\") " pod="openshift-monitoring/node-exporter-h5dt8"
Apr 17 09:13:56.435352 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.435261 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/16c665c3-2369-4f8c-ad19-990742a98173-root\") pod \"node-exporter-h5dt8\" (UID: \"16c665c3-2369-4f8c-ad19-990742a98173\") " pod="openshift-monitoring/node-exporter-h5dt8"
Apr 17 09:13:56.435352 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.435329 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/16c665c3-2369-4f8c-ad19-990742a98173-sys\") pod \"node-exporter-h5dt8\" (UID: \"16c665c3-2369-4f8c-ad19-990742a98173\") " pod="openshift-monitoring/node-exporter-h5dt8"
Apr 17 09:13:56.435576 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:13:56.435367 2606 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 17 09:13:56.435576 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.435372 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/16c665c3-2369-4f8c-ad19-990742a98173-node-exporter-wtmp\") pod \"node-exporter-h5dt8\" (UID: \"16c665c3-2369-4f8c-ad19-990742a98173\") " pod="openshift-monitoring/node-exporter-h5dt8"
Apr 17 09:13:56.435576 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.435405 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/16c665c3-2369-4f8c-ad19-990742a98173-node-exporter-accelerators-collector-config\") pod \"node-exporter-h5dt8\" (UID: \"16c665c3-2369-4f8c-ad19-990742a98173\") " pod="openshift-monitoring/node-exporter-h5dt8"
Apr 17 09:13:56.435576 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:13:56.435440 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16c665c3-2369-4f8c-ad19-990742a98173-node-exporter-tls podName:16c665c3-2369-4f8c-ad19-990742a98173 nodeName:}" failed. No retries permitted until 2026-04-17 09:13:56.935418723 +0000 UTC m=+177.654916757 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/16c665c3-2369-4f8c-ad19-990742a98173-node-exporter-tls") pod "node-exporter-h5dt8" (UID: "16c665c3-2369-4f8c-ad19-990742a98173") : secret "node-exporter-tls" not found Apr 17 09:13:56.435576 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.435447 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/16c665c3-2369-4f8c-ad19-990742a98173-sys\") pod \"node-exporter-h5dt8\" (UID: \"16c665c3-2369-4f8c-ad19-990742a98173\") " pod="openshift-monitoring/node-exporter-h5dt8" Apr 17 09:13:56.435576 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.435472 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/16c665c3-2369-4f8c-ad19-990742a98173-metrics-client-ca\") pod \"node-exporter-h5dt8\" (UID: \"16c665c3-2369-4f8c-ad19-990742a98173\") " pod="openshift-monitoring/node-exporter-h5dt8" Apr 17 09:13:56.435576 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.435526 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2nmt\" (UniqueName: \"kubernetes.io/projected/16c665c3-2369-4f8c-ad19-990742a98173-kube-api-access-h2nmt\") pod \"node-exporter-h5dt8\" (UID: \"16c665c3-2369-4f8c-ad19-990742a98173\") " pod="openshift-monitoring/node-exporter-h5dt8" Apr 17 09:13:56.435576 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.435565 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/16c665c3-2369-4f8c-ad19-990742a98173-node-exporter-wtmp\") pod \"node-exporter-h5dt8\" (UID: \"16c665c3-2369-4f8c-ad19-990742a98173\") " pod="openshift-monitoring/node-exporter-h5dt8" Apr 17 09:13:56.436342 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.436321 2606 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/16c665c3-2369-4f8c-ad19-990742a98173-node-exporter-textfile\") pod \"node-exporter-h5dt8\" (UID: \"16c665c3-2369-4f8c-ad19-990742a98173\") " pod="openshift-monitoring/node-exporter-h5dt8" Apr 17 09:13:56.436581 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.436550 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/16c665c3-2369-4f8c-ad19-990742a98173-node-exporter-accelerators-collector-config\") pod \"node-exporter-h5dt8\" (UID: \"16c665c3-2369-4f8c-ad19-990742a98173\") " pod="openshift-monitoring/node-exporter-h5dt8" Apr 17 09:13:56.436581 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.436566 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/16c665c3-2369-4f8c-ad19-990742a98173-metrics-client-ca\") pod \"node-exporter-h5dt8\" (UID: \"16c665c3-2369-4f8c-ad19-990742a98173\") " pod="openshift-monitoring/node-exporter-h5dt8" Apr 17 09:13:56.438337 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.438320 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/16c665c3-2369-4f8c-ad19-990742a98173-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h5dt8\" (UID: \"16c665c3-2369-4f8c-ad19-990742a98173\") " pod="openshift-monitoring/node-exporter-h5dt8" Apr 17 09:13:56.444665 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.444636 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2nmt\" (UniqueName: \"kubernetes.io/projected/16c665c3-2369-4f8c-ad19-990742a98173-kube-api-access-h2nmt\") pod \"node-exporter-h5dt8\" (UID: \"16c665c3-2369-4f8c-ad19-990742a98173\") " pod="openshift-monitoring/node-exporter-h5dt8" Apr 17 09:13:56.937442 ip-10-0-143-18 
kubenswrapper[2606]: I0417 09:13:56.937388 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/16c665c3-2369-4f8c-ad19-990742a98173-node-exporter-tls\") pod \"node-exporter-h5dt8\" (UID: \"16c665c3-2369-4f8c-ad19-990742a98173\") " pod="openshift-monitoring/node-exporter-h5dt8" Apr 17 09:13:56.939669 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:56.939646 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/16c665c3-2369-4f8c-ad19-990742a98173-node-exporter-tls\") pod \"node-exporter-h5dt8\" (UID: \"16c665c3-2369-4f8c-ad19-990742a98173\") " pod="openshift-monitoring/node-exporter-h5dt8" Apr 17 09:13:57.190588 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:57.190474 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-h5dt8" Apr 17 09:13:57.198553 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:13:57.198527 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16c665c3_2369_4f8c_ad19_990742a98173.slice/crio-db51b2e754247b8bda8cfccfb5c9333e7d6dc2f3d2edd07ae19d9132cde5b31c WatchSource:0}: Error finding container db51b2e754247b8bda8cfccfb5c9333e7d6dc2f3d2edd07ae19d9132cde5b31c: Status 404 returned error can't find the container with id db51b2e754247b8bda8cfccfb5c9333e7d6dc2f3d2edd07ae19d9132cde5b31c Apr 17 09:13:57.426968 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:57.426934 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h5dt8" event={"ID":"16c665c3-2369-4f8c-ad19-990742a98173","Type":"ContainerStarted","Data":"db51b2e754247b8bda8cfccfb5c9333e7d6dc2f3d2edd07ae19d9132cde5b31c"} Apr 17 09:13:58.430663 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:58.430577 2606 generic.go:358] "Generic (PLEG): container finished" 
podID="16c665c3-2369-4f8c-ad19-990742a98173" containerID="7fdfede2903dc9c750b674f88385402735fdbe2bc0779aa6d5e2aaa4207157bd" exitCode=0 Apr 17 09:13:58.431028 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:58.430669 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h5dt8" event={"ID":"16c665c3-2369-4f8c-ad19-990742a98173","Type":"ContainerDied","Data":"7fdfede2903dc9c750b674f88385402735fdbe2bc0779aa6d5e2aaa4207157bd"} Apr 17 09:13:59.435206 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:59.435170 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h5dt8" event={"ID":"16c665c3-2369-4f8c-ad19-990742a98173","Type":"ContainerStarted","Data":"8f23e1c8e8f9778ac25806c2973c511605e6bbd57a6246068c91e41b434abd96"} Apr 17 09:13:59.435206 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:59.435211 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h5dt8" event={"ID":"16c665c3-2369-4f8c-ad19-990742a98173","Type":"ContainerStarted","Data":"9fec338a931ff964d67cf541224daf9a32762ce10c522bba8fc761ff6f5625d2"} Apr 17 09:13:59.456233 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:13:59.456187 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-h5dt8" podStartSLOduration=2.52124909 podStartE2EDuration="3.456172558s" podCreationTimestamp="2026-04-17 09:13:56 +0000 UTC" firstStartedPulling="2026-04-17 09:13:57.200392271 +0000 UTC m=+177.919890320" lastFinishedPulling="2026-04-17 09:13:58.135315714 +0000 UTC m=+178.854813788" observedRunningTime="2026-04-17 09:13:59.454155529 +0000 UTC m=+180.173653585" watchObservedRunningTime="2026-04-17 09:13:59.456172558 +0000 UTC m=+180.175670614" Apr 17 09:14:01.162512 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:01.162457 2606 patch_prober.go:28] interesting pod/image-registry-569565b78d-fkh9d container/registry namespace/openshift-image-registry: Liveness 
probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 09:14:01.162879 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:01.162527 2606 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-569565b78d-fkh9d" podUID="d997507b-9e3c-4a7f-bb90-5f4b719b52c2" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 09:14:03.384442 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:03.384411 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-569565b78d-fkh9d" Apr 17 09:14:06.726165 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:06.726127 2606 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-569565b78d-fkh9d"] Apr 17 09:14:23.448022 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:23.447982 2606 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl" podUID="11812f01-f01e-4a9f-a011-51e2ede8fbc2" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 09:14:31.744767 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:31.744715 2606 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-569565b78d-fkh9d" podUID="d997507b-9e3c-4a7f-bb90-5f4b719b52c2" containerName="registry" containerID="cri-o://d1b31d200a2ae8933499ac030da6492be7884d114dae3f817dfcc3e44706f94e" gracePeriod=30 Apr 17 09:14:32.004400 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.004337 2606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-569565b78d-fkh9d" Apr 17 09:14:32.096829 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.096781 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-certificates\") pod \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " Apr 17 09:14:32.097008 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.096840 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-trusted-ca\") pod \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " Apr 17 09:14:32.097008 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.096868 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-installation-pull-secrets\") pod \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " Apr 17 09:14:32.097008 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.096916 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls\") pod \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " Apr 17 09:14:32.097008 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.096945 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-ca-trust-extracted\") pod \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " Apr 17 09:14:32.097008 
ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.096982 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-image-registry-private-configuration\") pod \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " Apr 17 09:14:32.097250 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.097020 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-bound-sa-token\") pod \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " Apr 17 09:14:32.097250 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.097045 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpjhm\" (UniqueName: \"kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-kube-api-access-hpjhm\") pod \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\" (UID: \"d997507b-9e3c-4a7f-bb90-5f4b719b52c2\") " Apr 17 09:14:32.097348 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.097280 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d997507b-9e3c-4a7f-bb90-5f4b719b52c2" (UID: "d997507b-9e3c-4a7f-bb90-5f4b719b52c2"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 09:14:32.097401 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.097376 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d997507b-9e3c-4a7f-bb90-5f4b719b52c2" (UID: "d997507b-9e3c-4a7f-bb90-5f4b719b52c2"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 09:14:32.099478 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.099447 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d997507b-9e3c-4a7f-bb90-5f4b719b52c2" (UID: "d997507b-9e3c-4a7f-bb90-5f4b719b52c2"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 09:14:32.099478 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.099461 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d997507b-9e3c-4a7f-bb90-5f4b719b52c2" (UID: "d997507b-9e3c-4a7f-bb90-5f4b719b52c2"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 09:14:32.099650 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.099613 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-kube-api-access-hpjhm" (OuterVolumeSpecName: "kube-api-access-hpjhm") pod "d997507b-9e3c-4a7f-bb90-5f4b719b52c2" (UID: "d997507b-9e3c-4a7f-bb90-5f4b719b52c2"). InnerVolumeSpecName "kube-api-access-hpjhm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 09:14:32.099650 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.099618 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "d997507b-9e3c-4a7f-bb90-5f4b719b52c2" (UID: "d997507b-9e3c-4a7f-bb90-5f4b719b52c2"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 09:14:32.099816 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.099798 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "d997507b-9e3c-4a7f-bb90-5f4b719b52c2" (UID: "d997507b-9e3c-4a7f-bb90-5f4b719b52c2"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 09:14:32.105629 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.105601 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d997507b-9e3c-4a7f-bb90-5f4b719b52c2" (UID: "d997507b-9e3c-4a7f-bb90-5f4b719b52c2"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 09:14:32.198366 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.198316 2606 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-certificates\") on node \"ip-10-0-143-18.ec2.internal\" DevicePath \"\"" Apr 17 09:14:32.198366 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.198363 2606 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-trusted-ca\") on node \"ip-10-0-143-18.ec2.internal\" DevicePath \"\"" Apr 17 09:14:32.198366 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.198373 2606 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-installation-pull-secrets\") on node \"ip-10-0-143-18.ec2.internal\" DevicePath 
\"\"" Apr 17 09:14:32.198366 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.198383 2606 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-registry-tls\") on node \"ip-10-0-143-18.ec2.internal\" DevicePath \"\"" Apr 17 09:14:32.198631 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.198391 2606 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-ca-trust-extracted\") on node \"ip-10-0-143-18.ec2.internal\" DevicePath \"\"" Apr 17 09:14:32.198631 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.198401 2606 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-image-registry-private-configuration\") on node \"ip-10-0-143-18.ec2.internal\" DevicePath \"\"" Apr 17 09:14:32.198631 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.198410 2606 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-bound-sa-token\") on node \"ip-10-0-143-18.ec2.internal\" DevicePath \"\"" Apr 17 09:14:32.198631 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.198419 2606 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hpjhm\" (UniqueName: \"kubernetes.io/projected/d997507b-9e3c-4a7f-bb90-5f4b719b52c2-kube-api-access-hpjhm\") on node \"ip-10-0-143-18.ec2.internal\" DevicePath \"\"" Apr 17 09:14:32.521554 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.521516 2606 generic.go:358] "Generic (PLEG): container finished" podID="d997507b-9e3c-4a7f-bb90-5f4b719b52c2" containerID="d1b31d200a2ae8933499ac030da6492be7884d114dae3f817dfcc3e44706f94e" exitCode=0 Apr 17 09:14:32.521717 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.521575 2606 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-569565b78d-fkh9d" event={"ID":"d997507b-9e3c-4a7f-bb90-5f4b719b52c2","Type":"ContainerDied","Data":"d1b31d200a2ae8933499ac030da6492be7884d114dae3f817dfcc3e44706f94e"} Apr 17 09:14:32.521717 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.521603 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-569565b78d-fkh9d" event={"ID":"d997507b-9e3c-4a7f-bb90-5f4b719b52c2","Type":"ContainerDied","Data":"6372a92101b05788e663572bf73d5f94c70b74df5b159a39d5fca5e68c28714d"} Apr 17 09:14:32.521717 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.521604 2606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-569565b78d-fkh9d" Apr 17 09:14:32.521717 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.521618 2606 scope.go:117] "RemoveContainer" containerID="d1b31d200a2ae8933499ac030da6492be7884d114dae3f817dfcc3e44706f94e" Apr 17 09:14:32.530068 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.530049 2606 scope.go:117] "RemoveContainer" containerID="d1b31d200a2ae8933499ac030da6492be7884d114dae3f817dfcc3e44706f94e" Apr 17 09:14:32.530346 ip-10-0-143-18 kubenswrapper[2606]: E0417 09:14:32.530326 2606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1b31d200a2ae8933499ac030da6492be7884d114dae3f817dfcc3e44706f94e\": container with ID starting with d1b31d200a2ae8933499ac030da6492be7884d114dae3f817dfcc3e44706f94e not found: ID does not exist" containerID="d1b31d200a2ae8933499ac030da6492be7884d114dae3f817dfcc3e44706f94e" Apr 17 09:14:32.530401 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.530354 2606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1b31d200a2ae8933499ac030da6492be7884d114dae3f817dfcc3e44706f94e"} err="failed to get container status 
\"d1b31d200a2ae8933499ac030da6492be7884d114dae3f817dfcc3e44706f94e\": rpc error: code = NotFound desc = could not find container \"d1b31d200a2ae8933499ac030da6492be7884d114dae3f817dfcc3e44706f94e\": container with ID starting with d1b31d200a2ae8933499ac030da6492be7884d114dae3f817dfcc3e44706f94e not found: ID does not exist" Apr 17 09:14:32.545615 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.545586 2606 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-569565b78d-fkh9d"] Apr 17 09:14:32.551984 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:32.551942 2606 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-569565b78d-fkh9d"] Apr 17 09:14:33.447268 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:33.447226 2606 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl" podUID="11812f01-f01e-4a9f-a011-51e2ede8fbc2" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 09:14:33.869616 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:33.869575 2606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d997507b-9e3c-4a7f-bb90-5f4b719b52c2" path="/var/lib/kubelet/pods/d997507b-9e3c-4a7f-bb90-5f4b719b52c2/volumes" Apr 17 09:14:43.447108 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:43.447067 2606 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl" podUID="11812f01-f01e-4a9f-a011-51e2ede8fbc2" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 09:14:43.447483 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:43.447135 2606 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl" Apr 17 09:14:43.447605 
ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:43.447587 2606 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"20f35152aa63b1e6a63bd528e960844e0de4f2b8411f2cd94c0277dfa926c28d"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 17 09:14:43.447641 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:43.447623 2606 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl" podUID="11812f01-f01e-4a9f-a011-51e2ede8fbc2" containerName="service-proxy" containerID="cri-o://20f35152aa63b1e6a63bd528e960844e0de4f2b8411f2cd94c0277dfa926c28d" gracePeriod=30 Apr 17 09:14:44.554258 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:44.554224 2606 generic.go:358] "Generic (PLEG): container finished" podID="11812f01-f01e-4a9f-a011-51e2ede8fbc2" containerID="20f35152aa63b1e6a63bd528e960844e0de4f2b8411f2cd94c0277dfa926c28d" exitCode=2 Apr 17 09:14:44.554649 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:44.554292 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl" event={"ID":"11812f01-f01e-4a9f-a011-51e2ede8fbc2","Type":"ContainerDied","Data":"20f35152aa63b1e6a63bd528e960844e0de4f2b8411f2cd94c0277dfa926c28d"} Apr 17 09:14:44.554649 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:14:44.554328 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-75d58b577-r96wl" event={"ID":"11812f01-f01e-4a9f-a011-51e2ede8fbc2","Type":"ContainerStarted","Data":"4ea05bef1a01e9796cc2ed72c508f2336ffa03cc9de2c3fc8973b3707d180899"} Apr 17 09:15:11.713426 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:15:11.713377 2606 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs\") pod \"network-metrics-daemon-2w27v\" (UID: \"ef0a40f6-04de-4672-9770-be487916c08b\") " pod="openshift-multus/network-metrics-daemon-2w27v" Apr 17 09:15:11.715825 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:15:11.715798 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0a40f6-04de-4672-9770-be487916c08b-metrics-certs\") pod \"network-metrics-daemon-2w27v\" (UID: \"ef0a40f6-04de-4672-9770-be487916c08b\") " pod="openshift-multus/network-metrics-daemon-2w27v" Apr 17 09:15:11.870365 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:15:11.870332 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wbqzc\"" Apr 17 09:15:11.877820 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:15:11.877788 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2w27v" Apr 17 09:15:11.999027 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:15:11.998953 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2w27v"] Apr 17 09:15:12.002297 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:15:12.002268 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef0a40f6_04de_4672_9770_be487916c08b.slice/crio-6c9441d9c235f42149549935bdb8f829bc9db61861ec031ed35fb4d1a7449166 WatchSource:0}: Error finding container 6c9441d9c235f42149549935bdb8f829bc9db61861ec031ed35fb4d1a7449166: Status 404 returned error can't find the container with id 6c9441d9c235f42149549935bdb8f829bc9db61861ec031ed35fb4d1a7449166 Apr 17 09:15:12.627117 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:15:12.627076 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2w27v" event={"ID":"ef0a40f6-04de-4672-9770-be487916c08b","Type":"ContainerStarted","Data":"6c9441d9c235f42149549935bdb8f829bc9db61861ec031ed35fb4d1a7449166"} Apr 17 09:15:13.631240 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:15:13.631202 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2w27v" event={"ID":"ef0a40f6-04de-4672-9770-be487916c08b","Type":"ContainerStarted","Data":"aa47167371adf48a3a4e4798780e1ed4b6f5531b2d044c3dda985ce686f95b0e"} Apr 17 09:15:13.631240 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:15:13.631244 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2w27v" event={"ID":"ef0a40f6-04de-4672-9770-be487916c08b","Type":"ContainerStarted","Data":"40c3b35404dc63dd91230946111058a6332c8fa8bd4c07f950254b84ac5c4380"} Apr 17 09:15:13.648273 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:15:13.648215 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-2w27v" podStartSLOduration=253.368828391 podStartE2EDuration="4m14.648200194s" podCreationTimestamp="2026-04-17 09:10:59 +0000 UTC" firstStartedPulling="2026-04-17 09:15:12.00401225 +0000 UTC m=+252.723510285" lastFinishedPulling="2026-04-17 09:15:13.283384037 +0000 UTC m=+254.002882088" observedRunningTime="2026-04-17 09:15:13.64696143 +0000 UTC m=+254.366459496" watchObservedRunningTime="2026-04-17 09:15:13.648200194 +0000 UTC m=+254.367698250" Apr 17 09:15:59.757759 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:15:59.757714 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pczcn_fd082fd1-df83-4c85-ba4b-b7b20f551f67/ovn-acl-logging/0.log" Apr 17 09:15:59.758333 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:15:59.757894 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pczcn_fd082fd1-df83-4c85-ba4b-b7b20f551f67/ovn-acl-logging/0.log" Apr 17 09:15:59.765200 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:15:59.765176 2606 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 09:18:38.017739 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:18:38.017701 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-r98dc"] Apr 17 09:18:38.018142 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:18:38.017934 2606 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d997507b-9e3c-4a7f-bb90-5f4b719b52c2" containerName="registry" Apr 17 09:18:38.018142 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:18:38.017960 2606 state_mem.go:107] "Deleted CPUSet assignment" podUID="d997507b-9e3c-4a7f-bb90-5f4b719b52c2" containerName="registry" Apr 17 09:18:38.018142 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:18:38.018008 2606 memory_manager.go:356] "RemoveStaleState removing state" podUID="d997507b-9e3c-4a7f-bb90-5f4b719b52c2" containerName="registry" Apr 17 
09:18:38.020643 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:18:38.020626 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-r98dc" Apr 17 09:18:38.023021 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:18:38.022996 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-operator-dockercfg-jc74m\"" Apr 17 09:18:38.023349 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:18:38.023320 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\"" Apr 17 09:18:38.024053 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:18:38.024035 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\"" Apr 17 09:18:38.029444 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:18:38.029423 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-r98dc"] Apr 17 09:18:38.145894 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:18:38.145852 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/84142610-6876-4e79-a209-dd2b4391f92b-tmp\") pod \"jobset-operator-747c5859c7-r98dc\" (UID: \"84142610-6876-4e79-a209-dd2b4391f92b\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-r98dc" Apr 17 09:18:38.146077 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:18:38.145917 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlhhn\" (UniqueName: \"kubernetes.io/projected/84142610-6876-4e79-a209-dd2b4391f92b-kube-api-access-hlhhn\") pod \"jobset-operator-747c5859c7-r98dc\" (UID: \"84142610-6876-4e79-a209-dd2b4391f92b\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-r98dc" Apr 17 09:18:38.246774 
ip-10-0-143-18 kubenswrapper[2606]: I0417 09:18:38.246736 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/84142610-6876-4e79-a209-dd2b4391f92b-tmp\") pod \"jobset-operator-747c5859c7-r98dc\" (UID: \"84142610-6876-4e79-a209-dd2b4391f92b\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-r98dc" Apr 17 09:18:38.246951 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:18:38.246793 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlhhn\" (UniqueName: \"kubernetes.io/projected/84142610-6876-4e79-a209-dd2b4391f92b-kube-api-access-hlhhn\") pod \"jobset-operator-747c5859c7-r98dc\" (UID: \"84142610-6876-4e79-a209-dd2b4391f92b\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-r98dc" Apr 17 09:18:38.247139 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:18:38.247115 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/84142610-6876-4e79-a209-dd2b4391f92b-tmp\") pod \"jobset-operator-747c5859c7-r98dc\" (UID: \"84142610-6876-4e79-a209-dd2b4391f92b\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-r98dc" Apr 17 09:18:38.255279 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:18:38.255244 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlhhn\" (UniqueName: \"kubernetes.io/projected/84142610-6876-4e79-a209-dd2b4391f92b-kube-api-access-hlhhn\") pod \"jobset-operator-747c5859c7-r98dc\" (UID: \"84142610-6876-4e79-a209-dd2b4391f92b\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-r98dc" Apr 17 09:18:38.330340 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:18:38.330229 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-r98dc" Apr 17 09:18:38.452548 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:18:38.452513 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-r98dc"] Apr 17 09:18:38.455994 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:18:38.455962 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84142610_6876_4e79_a209_dd2b4391f92b.slice/crio-86d8cece2d728f1eacdb92e47fcbd129fe60b55dd9642fe3dbc5c547d310ddde WatchSource:0}: Error finding container 86d8cece2d728f1eacdb92e47fcbd129fe60b55dd9642fe3dbc5c547d310ddde: Status 404 returned error can't find the container with id 86d8cece2d728f1eacdb92e47fcbd129fe60b55dd9642fe3dbc5c547d310ddde Apr 17 09:18:38.457814 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:18:38.457796 2606 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 09:18:39.160646 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:18:39.160604 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-r98dc" event={"ID":"84142610-6876-4e79-a209-dd2b4391f92b","Type":"ContainerStarted","Data":"86d8cece2d728f1eacdb92e47fcbd129fe60b55dd9642fe3dbc5c547d310ddde"} Apr 17 09:18:41.168113 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:18:41.168027 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-r98dc" event={"ID":"84142610-6876-4e79-a209-dd2b4391f92b","Type":"ContainerStarted","Data":"4a2c9cba4b623cc8d3dcbe448fe4071b8567a323c7a64d196c0b5d5efd5691dd"} Apr 17 09:18:41.186509 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:18:41.186430 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-operator-747c5859c7-r98dc" podStartSLOduration=0.741623069 
podStartE2EDuration="3.186411463s" podCreationTimestamp="2026-04-17 09:18:38 +0000 UTC" firstStartedPulling="2026-04-17 09:18:38.457918489 +0000 UTC m=+459.177416525" lastFinishedPulling="2026-04-17 09:18:40.90270688 +0000 UTC m=+461.622204919" observedRunningTime="2026-04-17 09:18:41.184643223 +0000 UTC m=+461.904141279" watchObservedRunningTime="2026-04-17 09:18:41.186411463 +0000 UTC m=+461.905909523" Apr 17 09:19:13.719736 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:19:13.719697 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-nzv9z"] Apr 17 09:19:13.726297 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:19:13.726274 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-nzv9z" Apr 17 09:19:13.728858 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:19:13.728830 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kubeflow-trainer-config\"" Apr 17 09:19:13.729003 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:19:13.728981 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 09:19:13.729109 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:19:13.729092 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 09:19:13.729152 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:19:13.729127 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-webhook-cert\"" Apr 17 09:19:13.730322 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:19:13.730178 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-controller-manager-dockercfg-q8b86\"" Apr 17 09:19:13.731625 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:19:13.731603 2606 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-nzv9z"] Apr 17 09:19:13.898121 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:19:13.898085 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/ccc18fd7-1677-43d0-b311-4fae510b9f61-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-nzv9z\" (UID: \"ccc18fd7-1677-43d0-b311-4fae510b9f61\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-nzv9z" Apr 17 09:19:13.898121 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:19:13.898132 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58zjp\" (UniqueName: \"kubernetes.io/projected/ccc18fd7-1677-43d0-b311-4fae510b9f61-kube-api-access-58zjp\") pod \"kubeflow-trainer-controller-manager-55f5694779-nzv9z\" (UID: \"ccc18fd7-1677-43d0-b311-4fae510b9f61\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-nzv9z" Apr 17 09:19:13.898351 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:19:13.898232 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccc18fd7-1677-43d0-b311-4fae510b9f61-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-nzv9z\" (UID: \"ccc18fd7-1677-43d0-b311-4fae510b9f61\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-nzv9z" Apr 17 09:19:13.999467 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:19:13.999350 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccc18fd7-1677-43d0-b311-4fae510b9f61-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-nzv9z\" (UID: \"ccc18fd7-1677-43d0-b311-4fae510b9f61\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-nzv9z" Apr 17 
09:19:13.999467 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:19:13.999420 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/ccc18fd7-1677-43d0-b311-4fae510b9f61-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-nzv9z\" (UID: \"ccc18fd7-1677-43d0-b311-4fae510b9f61\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-nzv9z" Apr 17 09:19:13.999467 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:19:13.999449 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58zjp\" (UniqueName: \"kubernetes.io/projected/ccc18fd7-1677-43d0-b311-4fae510b9f61-kube-api-access-58zjp\") pod \"kubeflow-trainer-controller-manager-55f5694779-nzv9z\" (UID: \"ccc18fd7-1677-43d0-b311-4fae510b9f61\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-nzv9z" Apr 17 09:19:14.000044 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:19:14.000024 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/ccc18fd7-1677-43d0-b311-4fae510b9f61-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-nzv9z\" (UID: \"ccc18fd7-1677-43d0-b311-4fae510b9f61\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-nzv9z" Apr 17 09:19:14.001818 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:19:14.001790 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccc18fd7-1677-43d0-b311-4fae510b9f61-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-nzv9z\" (UID: \"ccc18fd7-1677-43d0-b311-4fae510b9f61\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-nzv9z" Apr 17 09:19:14.008880 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:19:14.008849 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-58zjp\" (UniqueName: \"kubernetes.io/projected/ccc18fd7-1677-43d0-b311-4fae510b9f61-kube-api-access-58zjp\") pod \"kubeflow-trainer-controller-manager-55f5694779-nzv9z\" (UID: \"ccc18fd7-1677-43d0-b311-4fae510b9f61\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-nzv9z" Apr 17 09:19:14.036833 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:19:14.036792 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-nzv9z" Apr 17 09:19:14.161638 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:19:14.161603 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-nzv9z"] Apr 17 09:19:14.164745 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:19:14.164710 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccc18fd7_1677_43d0_b311_4fae510b9f61.slice/crio-12b6162b30de3f5a9648c7a41988d9814ccf18e2babee15f560a2ec830a6bc94 WatchSource:0}: Error finding container 12b6162b30de3f5a9648c7a41988d9814ccf18e2babee15f560a2ec830a6bc94: Status 404 returned error can't find the container with id 12b6162b30de3f5a9648c7a41988d9814ccf18e2babee15f560a2ec830a6bc94 Apr 17 09:19:14.253522 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:19:14.253415 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-nzv9z" event={"ID":"ccc18fd7-1677-43d0-b311-4fae510b9f61","Type":"ContainerStarted","Data":"12b6162b30de3f5a9648c7a41988d9814ccf18e2babee15f560a2ec830a6bc94"} Apr 17 09:19:17.263115 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:19:17.263068 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-nzv9z" 
event={"ID":"ccc18fd7-1677-43d0-b311-4fae510b9f61","Type":"ContainerStarted","Data":"d0aff1fab61a48aff4d6d4a82c7d5befec5665dda1d1615336639db9cb6b7af5"} Apr 17 09:19:17.263579 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:19:17.263190 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-nzv9z" Apr 17 09:19:17.279374 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:19:17.279309 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-nzv9z" podStartSLOduration=1.849415093 podStartE2EDuration="4.279237815s" podCreationTimestamp="2026-04-17 09:19:13 +0000 UTC" firstStartedPulling="2026-04-17 09:19:14.166419679 +0000 UTC m=+494.885917720" lastFinishedPulling="2026-04-17 09:19:16.596242394 +0000 UTC m=+497.315740442" observedRunningTime="2026-04-17 09:19:17.278961803 +0000 UTC m=+497.998459863" watchObservedRunningTime="2026-04-17 09:19:17.279237815 +0000 UTC m=+497.998735873" Apr 17 09:19:33.270717 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:19:33.270679 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-nzv9z" Apr 17 09:19:49.499614 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:19:49.499535 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-nzv9z_ccc18fd7-1677-43d0-b311-4fae510b9f61/manager/0.log" Apr 17 09:19:49.930388 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:19:49.930361 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-nzv9z_ccc18fd7-1677-43d0-b311-4fae510b9f61/manager/0.log" Apr 17 09:19:50.363978 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:19:50.363950 2606 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-nzv9z_ccc18fd7-1677-43d0-b311-4fae510b9f61/manager/0.log" Apr 17 09:20:25.308187 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:25.308150 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-v8842/must-gather-6p2ww"] Apr 17 09:20:25.311226 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:25.311207 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v8842/must-gather-6p2ww" Apr 17 09:20:25.313935 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:25.313910 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-v8842\"/\"kube-root-ca.crt\"" Apr 17 09:20:25.314042 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:25.313970 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-v8842\"/\"openshift-service-ca.crt\"" Apr 17 09:20:25.319212 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:25.319187 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-v8842/must-gather-6p2ww"] Apr 17 09:20:25.371201 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:25.371154 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7kzd\" (UniqueName: \"kubernetes.io/projected/0d659a48-c3d0-4250-90ca-31962520a8c6-kube-api-access-j7kzd\") pod \"must-gather-6p2ww\" (UID: \"0d659a48-c3d0-4250-90ca-31962520a8c6\") " pod="openshift-must-gather-v8842/must-gather-6p2ww" Apr 17 09:20:25.371381 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:25.371264 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0d659a48-c3d0-4250-90ca-31962520a8c6-must-gather-output\") pod \"must-gather-6p2ww\" (UID: \"0d659a48-c3d0-4250-90ca-31962520a8c6\") " 
pod="openshift-must-gather-v8842/must-gather-6p2ww" Apr 17 09:20:25.471789 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:25.471749 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0d659a48-c3d0-4250-90ca-31962520a8c6-must-gather-output\") pod \"must-gather-6p2ww\" (UID: \"0d659a48-c3d0-4250-90ca-31962520a8c6\") " pod="openshift-must-gather-v8842/must-gather-6p2ww" Apr 17 09:20:25.471996 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:25.471804 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7kzd\" (UniqueName: \"kubernetes.io/projected/0d659a48-c3d0-4250-90ca-31962520a8c6-kube-api-access-j7kzd\") pod \"must-gather-6p2ww\" (UID: \"0d659a48-c3d0-4250-90ca-31962520a8c6\") " pod="openshift-must-gather-v8842/must-gather-6p2ww" Apr 17 09:20:25.472171 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:25.472149 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0d659a48-c3d0-4250-90ca-31962520a8c6-must-gather-output\") pod \"must-gather-6p2ww\" (UID: \"0d659a48-c3d0-4250-90ca-31962520a8c6\") " pod="openshift-must-gather-v8842/must-gather-6p2ww" Apr 17 09:20:25.480189 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:25.480154 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7kzd\" (UniqueName: \"kubernetes.io/projected/0d659a48-c3d0-4250-90ca-31962520a8c6-kube-api-access-j7kzd\") pod \"must-gather-6p2ww\" (UID: \"0d659a48-c3d0-4250-90ca-31962520a8c6\") " pod="openshift-must-gather-v8842/must-gather-6p2ww" Apr 17 09:20:25.621090 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:25.621059 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v8842/must-gather-6p2ww" Apr 17 09:20:25.739547 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:25.739514 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-v8842/must-gather-6p2ww"] Apr 17 09:20:25.743313 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:20:25.743279 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d659a48_c3d0_4250_90ca_31962520a8c6.slice/crio-fdc5b4cd94ffe3b1d3086326f256a24a30135dc496ae3e00ca1cbbb9ff0502ca WatchSource:0}: Error finding container fdc5b4cd94ffe3b1d3086326f256a24a30135dc496ae3e00ca1cbbb9ff0502ca: Status 404 returned error can't find the container with id fdc5b4cd94ffe3b1d3086326f256a24a30135dc496ae3e00ca1cbbb9ff0502ca Apr 17 09:20:26.449739 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:26.449702 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v8842/must-gather-6p2ww" event={"ID":"0d659a48-c3d0-4250-90ca-31962520a8c6","Type":"ContainerStarted","Data":"fdc5b4cd94ffe3b1d3086326f256a24a30135dc496ae3e00ca1cbbb9ff0502ca"} Apr 17 09:20:27.455040 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:27.454957 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v8842/must-gather-6p2ww" event={"ID":"0d659a48-c3d0-4250-90ca-31962520a8c6","Type":"ContainerStarted","Data":"2e8c04b919a88f60c3a5144f41d56b9790e65d219c8d56e6a782483c3535dfff"} Apr 17 09:20:27.455040 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:27.455006 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v8842/must-gather-6p2ww" event={"ID":"0d659a48-c3d0-4250-90ca-31962520a8c6","Type":"ContainerStarted","Data":"e7da1ea12c5d535dd2e0cee408454240905a912bc5351c656c534f29386495f1"} Apr 17 09:20:27.470538 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:27.470460 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-v8842/must-gather-6p2ww" podStartSLOduration=1.238584534 podStartE2EDuration="2.47044076s" podCreationTimestamp="2026-04-17 09:20:25 +0000 UTC" firstStartedPulling="2026-04-17 09:20:25.745362018 +0000 UTC m=+566.464860056" lastFinishedPulling="2026-04-17 09:20:26.977218248 +0000 UTC m=+567.696716282" observedRunningTime="2026-04-17 09:20:27.469889563 +0000 UTC m=+568.189387620" watchObservedRunningTime="2026-04-17 09:20:27.47044076 +0000 UTC m=+568.189938818" Apr 17 09:20:28.348455 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:28.348428 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-vm9k6_c0db5b5a-0bf0-459d-ba1b-46054d880831/global-pull-secret-syncer/0.log" Apr 17 09:20:28.481239 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:28.481214 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-8ntkx_4a59d22a-1ddf-48a4-b7d4-91e233d1c8b9/konnectivity-agent/0.log" Apr 17 09:20:28.613519 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:28.613471 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-18.ec2.internal_955792ba1322848ad185260c969a8a99/haproxy/0.log" Apr 17 09:20:32.343976 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:32.343947 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-h5dt8_16c665c3-2369-4f8c-ad19-990742a98173/node-exporter/0.log" Apr 17 09:20:32.365833 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:32.365801 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-h5dt8_16c665c3-2369-4f8c-ad19-990742a98173/kube-rbac-proxy/0.log" Apr 17 09:20:32.390089 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:32.390063 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-h5dt8_16c665c3-2369-4f8c-ad19-990742a98173/init-textfile/0.log" Apr 17 
09:20:34.158184 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:34.158148 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-m7rcj_8381b0dc-c82c-4102-aa7e-97c8fec5ffc8/networking-console-plugin/0.log" Apr 17 09:20:35.138393 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:35.138354 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-v8842/perf-node-gather-daemonset-ktd4n"] Apr 17 09:20:35.141566 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:35.141537 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v8842/perf-node-gather-daemonset-ktd4n" Apr 17 09:20:35.144745 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:35.144721 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-v8842\"/\"default-dockercfg-mxnlg\"" Apr 17 09:20:35.154446 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:35.154415 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-v8842/perf-node-gather-daemonset-ktd4n"] Apr 17 09:20:35.252063 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:35.252018 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcp97\" (UniqueName: \"kubernetes.io/projected/e1f3d70a-7954-491c-b307-cf861df879ca-kube-api-access-mcp97\") pod \"perf-node-gather-daemonset-ktd4n\" (UID: \"e1f3d70a-7954-491c-b307-cf861df879ca\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-ktd4n" Apr 17 09:20:35.252479 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:35.252073 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e1f3d70a-7954-491c-b307-cf861df879ca-podres\") pod \"perf-node-gather-daemonset-ktd4n\" (UID: \"e1f3d70a-7954-491c-b307-cf861df879ca\") " 
pod="openshift-must-gather-v8842/perf-node-gather-daemonset-ktd4n" Apr 17 09:20:35.252479 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:35.252130 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e1f3d70a-7954-491c-b307-cf861df879ca-sys\") pod \"perf-node-gather-daemonset-ktd4n\" (UID: \"e1f3d70a-7954-491c-b307-cf861df879ca\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-ktd4n" Apr 17 09:20:35.252479 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:35.252152 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e1f3d70a-7954-491c-b307-cf861df879ca-proc\") pod \"perf-node-gather-daemonset-ktd4n\" (UID: \"e1f3d70a-7954-491c-b307-cf861df879ca\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-ktd4n" Apr 17 09:20:35.252479 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:35.252231 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e1f3d70a-7954-491c-b307-cf861df879ca-lib-modules\") pod \"perf-node-gather-daemonset-ktd4n\" (UID: \"e1f3d70a-7954-491c-b307-cf861df879ca\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-ktd4n" Apr 17 09:20:35.353448 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:35.353407 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mcp97\" (UniqueName: \"kubernetes.io/projected/e1f3d70a-7954-491c-b307-cf861df879ca-kube-api-access-mcp97\") pod \"perf-node-gather-daemonset-ktd4n\" (UID: \"e1f3d70a-7954-491c-b307-cf861df879ca\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-ktd4n" Apr 17 09:20:35.353659 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:35.353460 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" 
(UniqueName: \"kubernetes.io/host-path/e1f3d70a-7954-491c-b307-cf861df879ca-podres\") pod \"perf-node-gather-daemonset-ktd4n\" (UID: \"e1f3d70a-7954-491c-b307-cf861df879ca\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-ktd4n"
Apr 17 09:20:35.353659 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:35.353565 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e1f3d70a-7954-491c-b307-cf861df879ca-sys\") pod \"perf-node-gather-daemonset-ktd4n\" (UID: \"e1f3d70a-7954-491c-b307-cf861df879ca\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-ktd4n"
Apr 17 09:20:35.353659 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:35.353598 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e1f3d70a-7954-491c-b307-cf861df879ca-proc\") pod \"perf-node-gather-daemonset-ktd4n\" (UID: \"e1f3d70a-7954-491c-b307-cf861df879ca\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-ktd4n"
Apr 17 09:20:35.353659 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:35.353645 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e1f3d70a-7954-491c-b307-cf861df879ca-lib-modules\") pod \"perf-node-gather-daemonset-ktd4n\" (UID: \"e1f3d70a-7954-491c-b307-cf861df879ca\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-ktd4n"
Apr 17 09:20:35.353858 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:35.353672 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e1f3d70a-7954-491c-b307-cf861df879ca-podres\") pod \"perf-node-gather-daemonset-ktd4n\" (UID: \"e1f3d70a-7954-491c-b307-cf861df879ca\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-ktd4n"
Apr 17 09:20:35.353858 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:35.353677 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e1f3d70a-7954-491c-b307-cf861df879ca-sys\") pod \"perf-node-gather-daemonset-ktd4n\" (UID: \"e1f3d70a-7954-491c-b307-cf861df879ca\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-ktd4n"
Apr 17 09:20:35.353858 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:35.353717 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e1f3d70a-7954-491c-b307-cf861df879ca-proc\") pod \"perf-node-gather-daemonset-ktd4n\" (UID: \"e1f3d70a-7954-491c-b307-cf861df879ca\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-ktd4n"
Apr 17 09:20:35.353858 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:35.353800 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e1f3d70a-7954-491c-b307-cf861df879ca-lib-modules\") pod \"perf-node-gather-daemonset-ktd4n\" (UID: \"e1f3d70a-7954-491c-b307-cf861df879ca\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-ktd4n"
Apr 17 09:20:35.363932 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:35.363903 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcp97\" (UniqueName: \"kubernetes.io/projected/e1f3d70a-7954-491c-b307-cf861df879ca-kube-api-access-mcp97\") pod \"perf-node-gather-daemonset-ktd4n\" (UID: \"e1f3d70a-7954-491c-b307-cf861df879ca\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-ktd4n"
Apr 17 09:20:35.451936 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:35.451842 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v8842/perf-node-gather-daemonset-ktd4n"
Apr 17 09:20:35.591928 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:35.591829 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-v8842/perf-node-gather-daemonset-ktd4n"]
Apr 17 09:20:35.594829 ip-10-0-143-18 kubenswrapper[2606]: W0417 09:20:35.594787 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode1f3d70a_7954_491c_b307_cf861df879ca.slice/crio-46609e33f10640cd73e4d6662b8afe61cf94041bdf9dbaeec9ea2d6eb6aaf6a2 WatchSource:0}: Error finding container 46609e33f10640cd73e4d6662b8afe61cf94041bdf9dbaeec9ea2d6eb6aaf6a2: Status 404 returned error can't find the container with id 46609e33f10640cd73e4d6662b8afe61cf94041bdf9dbaeec9ea2d6eb6aaf6a2
Apr 17 09:20:36.089140 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:36.089046 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6c4sw_cb63c154-05d7-4e31-bcd5-6b2da69d604c/dns/0.log"
Apr 17 09:20:36.116892 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:36.116864 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6c4sw_cb63c154-05d7-4e31-bcd5-6b2da69d604c/kube-rbac-proxy/0.log"
Apr 17 09:20:36.296551 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:36.296522 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wxmnk_0a1e21cd-b620-41c1-9c1d-4e6fba3925ef/dns-node-resolver/0.log"
Apr 17 09:20:36.489472 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:36.489439 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v8842/perf-node-gather-daemonset-ktd4n" event={"ID":"e1f3d70a-7954-491c-b307-cf861df879ca","Type":"ContainerStarted","Data":"392f91e4fa00b9b4f8886d9b5423f72b9e75fee5fe9856508ff6cace84a10c8e"}
Apr 17 09:20:36.489472 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:36.489474 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v8842/perf-node-gather-daemonset-ktd4n" event={"ID":"e1f3d70a-7954-491c-b307-cf861df879ca","Type":"ContainerStarted","Data":"46609e33f10640cd73e4d6662b8afe61cf94041bdf9dbaeec9ea2d6eb6aaf6a2"}
Apr 17 09:20:36.489710 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:36.489618 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-v8842/perf-node-gather-daemonset-ktd4n"
Apr 17 09:20:36.507428 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:36.507367 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-v8842/perf-node-gather-daemonset-ktd4n" podStartSLOduration=1.507349312 podStartE2EDuration="1.507349312s" podCreationTimestamp="2026-04-17 09:20:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 09:20:36.505978002 +0000 UTC m=+577.225476083" watchObservedRunningTime="2026-04-17 09:20:36.507349312 +0000 UTC m=+577.226847370"
Apr 17 09:20:36.709278 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:36.709253 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-g4rrd_e4f475ef-a8f7-4778-86ea-42db013683b6/node-ca/0.log"
Apr 17 09:20:37.833933 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:37.833906 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-snccl_2e6cf676-b554-4b97-99de-d1ff810ef911/serve-healthcheck-canary/0.log"
Apr 17 09:20:38.275029 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:38.275000 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-88rzd_4ead68b2-4f68-4605-9c1e-c4f73c735c33/kube-rbac-proxy/0.log"
Apr 17 09:20:38.295842 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:38.295817 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-88rzd_4ead68b2-4f68-4605-9c1e-c4f73c735c33/exporter/0.log"
Apr 17 09:20:38.319617 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:38.319583 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-88rzd_4ead68b2-4f68-4605-9c1e-c4f73c735c33/extractor/0.log"
Apr 17 09:20:40.018713 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:40.018681 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-operator-747c5859c7-r98dc_84142610-6876-4e79-a209-dd2b4391f92b/jobset-operator/0.log"
Apr 17 09:20:42.506405 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:42.506351 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-v8842/perf-node-gather-daemonset-ktd4n"
Apr 17 09:20:44.759090 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:44.758938 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2jnqg_07361f70-d7d2-4866-8e18-3bd88e02229e/kube-multus-additional-cni-plugins/0.log"
Apr 17 09:20:44.786523 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:44.786485 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2jnqg_07361f70-d7d2-4866-8e18-3bd88e02229e/egress-router-binary-copy/0.log"
Apr 17 09:20:44.811234 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:44.811207 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2jnqg_07361f70-d7d2-4866-8e18-3bd88e02229e/cni-plugins/0.log"
Apr 17 09:20:44.833953 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:44.833927 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2jnqg_07361f70-d7d2-4866-8e18-3bd88e02229e/bond-cni-plugin/0.log"
Apr 17 09:20:44.859689 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:44.859659 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2jnqg_07361f70-d7d2-4866-8e18-3bd88e02229e/routeoverride-cni/0.log"
Apr 17 09:20:44.881339 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:44.881306 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2jnqg_07361f70-d7d2-4866-8e18-3bd88e02229e/whereabouts-cni-bincopy/0.log"
Apr 17 09:20:44.903814 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:44.903788 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2jnqg_07361f70-d7d2-4866-8e18-3bd88e02229e/whereabouts-cni/0.log"
Apr 17 09:20:45.280602 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:45.280570 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d5sb9_fdf1115b-531b-4095-86b7-8ac9a436dd2f/kube-multus/0.log"
Apr 17 09:20:45.302779 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:45.302751 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2w27v_ef0a40f6-04de-4672-9770-be487916c08b/network-metrics-daemon/0.log"
Apr 17 09:20:45.327379 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:45.327345 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2w27v_ef0a40f6-04de-4672-9770-be487916c08b/kube-rbac-proxy/0.log"
Apr 17 09:20:46.487907 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:46.487825 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pczcn_fd082fd1-df83-4c85-ba4b-b7b20f551f67/ovn-controller/0.log"
Apr 17 09:20:46.507930 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:46.507893 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pczcn_fd082fd1-df83-4c85-ba4b-b7b20f551f67/ovn-acl-logging/0.log"
Apr 17 09:20:46.511410 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:46.511383 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pczcn_fd082fd1-df83-4c85-ba4b-b7b20f551f67/ovn-acl-logging/1.log"
Apr 17 09:20:46.534448 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:46.534415 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pczcn_fd082fd1-df83-4c85-ba4b-b7b20f551f67/kube-rbac-proxy-node/0.log"
Apr 17 09:20:46.562078 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:46.562051 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pczcn_fd082fd1-df83-4c85-ba4b-b7b20f551f67/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 09:20:46.581590 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:46.581560 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pczcn_fd082fd1-df83-4c85-ba4b-b7b20f551f67/northd/0.log"
Apr 17 09:20:46.605834 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:46.605808 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pczcn_fd082fd1-df83-4c85-ba4b-b7b20f551f67/nbdb/0.log"
Apr 17 09:20:46.628289 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:46.628265 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pczcn_fd082fd1-df83-4c85-ba4b-b7b20f551f67/sbdb/0.log"
Apr 17 09:20:46.727223 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:46.727190 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pczcn_fd082fd1-df83-4c85-ba4b-b7b20f551f67/ovnkube-controller/0.log"
Apr 17 09:20:48.070878 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:48.070850 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-7l5df_252902a0-d0ed-496b-bcbb-6dc20ec3c9d4/network-check-target-container/0.log"
Apr 17 09:20:49.035942 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:49.035912 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-mnb85_dfb32cbd-12dc-4c1b-add8-12bb56a19a40/iptables-alerter/0.log"
Apr 17 09:20:49.657865 ip-10-0-143-18 kubenswrapper[2606]: I0417 09:20:49.657821 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-dcjr8_a16122cc-e651-4705-adba-ac8adbc56a48/tuned/0.log"