Apr 21 03:54:42.924765 ip-10-0-131-93 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 03:54:42.924776 ip-10-0-131-93 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 03:54:42.924783 ip-10-0-131-93 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 03:54:42.924997 ip-10-0-131-93 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 03:54:53.067189 ip-10-0-131-93 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 03:54:53.067208 ip-10-0-131-93 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 654e4b2e93cf407db9f2c8b1d183ea56 --
Apr 21 03:57:16.451322 ip-10-0-131-93 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 03:57:16.905234 ip-10-0-131-93 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 03:57:16.905234 ip-10-0-131-93 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 03:57:16.905234 ip-10-0-131-93 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 03:57:16.905234 ip-10-0-131-93 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 03:57:16.905234 ip-10-0-131-93 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
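Note: the deprecation warnings above all point at the kubelet config file (--config, which the flag dump later in this log shows as /etc/kubernetes/kubelet.conf). As a rough sketch only — not the actual contents of that file on this node — the flagged options would map onto KubeletConfiguration fields described at the linked documentation roughly like this, with values mirrored from the flag dump below:

# Illustrative sketch, assuming the KubeletConfiguration v1beta1 API; not this node's real kubelet.conf.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: "/var/run/crio/crio.sock"            # replaces --container-runtime-endpoint
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"  # replaces --volume-plugin-dir
systemReserved:                                                 # replaces --system-reserved
  cpu: "500m"
  memory: "1Gi"
  ephemeral-storage: "1Gi"
# --minimum-container-ttl-duration is superseded by the evictionHard / evictionSoft settings in this file.
# --pod-infra-container-image has no config-file equivalent; the sandbox (pause) image now comes from the CRI runtime's own configuration (CRI-O here).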
Apr 21 03:57:16.907344 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.907182 2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 21 03:57:16.909648 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909633 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 03:57:16.909648 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909649 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 03:57:16.909710 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909653 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 03:57:16.909710 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909656 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 03:57:16.909710 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909659 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 03:57:16.909710 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909662 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 03:57:16.909710 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909665 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 03:57:16.909710 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909667 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 03:57:16.909710 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909670 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 03:57:16.909710 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909674 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 03:57:16.909710 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909678 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 03:57:16.909710 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909681 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 03:57:16.909710 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909683 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 03:57:16.909710 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909686 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 03:57:16.909710 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909689 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 03:57:16.909710 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909691 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 03:57:16.909710 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909694 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 03:57:16.909710 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909697 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 03:57:16.909710 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909700 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 03:57:16.909710 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909702 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 03:57:16.909710 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909705 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 03:57:16.909710 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909708 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 03:57:16.910198 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909711 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 03:57:16.910198 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909713 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 03:57:16.910198 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909719 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 03:57:16.910198 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909722 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 03:57:16.910198 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909725 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 03:57:16.910198 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909727 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 03:57:16.910198 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909732 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 03:57:16.910198 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909736 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 03:57:16.910198 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909739 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 03:57:16.910198 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909742 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 03:57:16.910198 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909745 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 03:57:16.910198 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909748 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 03:57:16.910198 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909751 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 03:57:16.910198 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909754 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 03:57:16.910198 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909757 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 03:57:16.910198 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909760 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 03:57:16.910198 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909762 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 03:57:16.910198 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909765 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 03:57:16.910662 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909767 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 03:57:16.910662 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909770 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 03:57:16.910662 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909772 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 03:57:16.910662 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909775 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 03:57:16.910662 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909777 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 03:57:16.910662 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909780 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 03:57:16.910662 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909782 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 03:57:16.910662 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909785 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 03:57:16.910662 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909787 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 03:57:16.910662 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909790 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 03:57:16.910662 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909792 2570 feature_gate.go:328] unrecognized feature gate: 
BootcNodeManagement Apr 21 03:57:16.910662 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909795 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 03:57:16.910662 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909798 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 03:57:16.910662 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909800 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 03:57:16.910662 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909803 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 03:57:16.910662 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909807 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 03:57:16.910662 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909810 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 03:57:16.910662 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909812 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 03:57:16.910662 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909815 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 03:57:16.911147 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909818 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 03:57:16.911147 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909820 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 03:57:16.911147 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909822 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 03:57:16.911147 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909825 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 03:57:16.911147 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909828 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 21 03:57:16.911147 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909830 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 03:57:16.911147 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909833 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 03:57:16.911147 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909835 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 03:57:16.911147 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909837 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 03:57:16.911147 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909840 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 03:57:16.911147 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909842 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 03:57:16.911147 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909845 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 03:57:16.911147 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909847 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 03:57:16.911147 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909850 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 03:57:16.911147 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909854 2570 feature_gate.go:328] unrecognized feature gate: 
AzureWorkloadIdentity Apr 21 03:57:16.911147 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909856 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 03:57:16.911147 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909859 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 03:57:16.911147 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909862 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 03:57:16.911147 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909864 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 03:57:16.911147 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909867 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 03:57:16.911147 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909869 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 03:57:16.911655 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909872 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 03:57:16.911655 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909875 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 03:57:16.911655 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909877 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 03:57:16.911655 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909880 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 03:57:16.911655 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909882 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 03:57:16.911655 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.909885 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 03:57:16.912912 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912899 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 03:57:16.912912 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912911 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 03:57:16.912975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912915 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 03:57:16.912975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912918 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 03:57:16.912975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912922 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 03:57:16.912975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912925 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 03:57:16.912975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912927 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 03:57:16.912975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912930 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 03:57:16.912975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912933 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 03:57:16.912975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912935 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 03:57:16.912975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912938 2570 feature_gate.go:328] unrecognized feature gate: 
AzureMultiDisk Apr 21 03:57:16.912975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912940 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 03:57:16.912975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912943 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 03:57:16.912975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912946 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 03:57:16.912975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912948 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 03:57:16.912975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912951 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 03:57:16.912975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912953 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 03:57:16.912975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912956 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 03:57:16.912975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912958 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 03:57:16.912975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912961 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 03:57:16.912975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912963 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 03:57:16.912975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912966 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 03:57:16.913474 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912969 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 03:57:16.913474 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912971 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 03:57:16.913474 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912974 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 03:57:16.913474 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912976 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 03:57:16.913474 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912979 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 03:57:16.913474 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912982 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 03:57:16.913474 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912985 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 03:57:16.913474 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912988 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 03:57:16.913474 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912990 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 03:57:16.913474 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912993 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 03:57:16.913474 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912995 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 03:57:16.913474 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.912998 2570 
feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 03:57:16.913474 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913000 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 03:57:16.913474 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913003 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 03:57:16.913474 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913005 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 03:57:16.913474 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913008 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 03:57:16.913474 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913010 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 03:57:16.913474 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913013 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 03:57:16.913474 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913015 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 03:57:16.913474 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913018 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 03:57:16.913975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913020 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 03:57:16.913975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913023 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 03:57:16.913975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913025 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 03:57:16.913975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913028 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 21 03:57:16.913975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913030 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 03:57:16.913975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913045 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 03:57:16.913975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913048 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 03:57:16.913975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913050 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 03:57:16.913975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913053 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 03:57:16.913975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913055 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 03:57:16.913975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913057 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 03:57:16.913975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913060 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 03:57:16.913975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913063 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 03:57:16.913975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913068 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. 
It will be removed in a future release. Apr 21 03:57:16.913975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913072 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 03:57:16.913975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913075 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 03:57:16.913975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913078 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 03:57:16.913975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913081 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 03:57:16.913975 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913084 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 03:57:16.914472 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913086 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 03:57:16.914472 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913089 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 03:57:16.914472 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913092 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 03:57:16.914472 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913094 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 03:57:16.914472 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913097 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 03:57:16.914472 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913099 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 03:57:16.914472 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913102 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 03:57:16.914472 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913104 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 03:57:16.914472 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913107 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 03:57:16.914472 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913109 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 03:57:16.914472 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913112 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 03:57:16.914472 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913115 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 03:57:16.914472 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913117 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 03:57:16.914472 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913120 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 03:57:16.914472 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913124 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 03:57:16.914472 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913127 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 03:57:16.914472 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913130 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 03:57:16.914472 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913134 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 03:57:16.914472 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913136 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 03:57:16.914934 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913139 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 03:57:16.914934 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913141 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 03:57:16.914934 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913144 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 03:57:16.914934 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913146 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 03:57:16.914934 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913149 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 03:57:16.914934 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913152 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 03:57:16.914934 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913222 2570 flags.go:64] FLAG: --address="0.0.0.0" Apr 21 03:57:16.914934 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913238 2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 21 03:57:16.914934 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913248 2570 flags.go:64] FLAG: --anonymous-auth="true" Apr 21 03:57:16.914934 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913254 2570 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 21 03:57:16.914934 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913261 2570 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 21 03:57:16.914934 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913265 2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 21 03:57:16.914934 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913270 2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 21 03:57:16.914934 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913274 2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 21 03:57:16.914934 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913278 2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 21 03:57:16.914934 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913281 2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 21 03:57:16.914934 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913284 2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 21 03:57:16.914934 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913288 2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 21 03:57:16.914934 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913291 2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 21 03:57:16.914934 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913294 2570 flags.go:64] FLAG: 
--cgroup-root="" Apr 21 03:57:16.914934 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913297 2570 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 21 03:57:16.914934 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913300 2570 flags.go:64] FLAG: --client-ca-file="" Apr 21 03:57:16.914934 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913302 2570 flags.go:64] FLAG: --cloud-config="" Apr 21 03:57:16.915503 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913305 2570 flags.go:64] FLAG: --cloud-provider="external" Apr 21 03:57:16.915503 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913308 2570 flags.go:64] FLAG: --cluster-dns="[]" Apr 21 03:57:16.915503 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913313 2570 flags.go:64] FLAG: --cluster-domain="" Apr 21 03:57:16.915503 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913316 2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 21 03:57:16.915503 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913320 2570 flags.go:64] FLAG: --config-dir="" Apr 21 03:57:16.915503 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913323 2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 21 03:57:16.915503 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913326 2570 flags.go:64] FLAG: --container-log-max-files="5" Apr 21 03:57:16.915503 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913331 2570 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 21 03:57:16.915503 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913333 2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 21 03:57:16.915503 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913337 2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 21 03:57:16.915503 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913340 2570 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 21 03:57:16.915503 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913343 2570 flags.go:64] FLAG: --contention-profiling="false" Apr 21 03:57:16.915503 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913346 2570 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 21 03:57:16.915503 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913350 2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 21 03:57:16.915503 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913353 2570 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 21 03:57:16.915503 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913356 2570 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 21 03:57:16.915503 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913361 2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 21 03:57:16.915503 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913365 2570 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 21 03:57:16.915503 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913368 2570 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 21 03:57:16.915503 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913371 2570 flags.go:64] FLAG: --enable-load-reader="false" Apr 21 03:57:16.915503 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913374 2570 flags.go:64] FLAG: --enable-server="true" Apr 21 03:57:16.915503 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913377 2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 21 03:57:16.915503 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913381 2570 flags.go:64] FLAG: --event-burst="100" Apr 
21 03:57:16.915503 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913384 2570 flags.go:64] FLAG: --event-qps="50" Apr 21 03:57:16.915503 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913387 2570 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 21 03:57:16.916122 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913390 2570 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 21 03:57:16.916122 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913393 2570 flags.go:64] FLAG: --eviction-hard="" Apr 21 03:57:16.916122 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913396 2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 21 03:57:16.916122 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913399 2570 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 21 03:57:16.916122 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913402 2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 21 03:57:16.916122 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913405 2570 flags.go:64] FLAG: --eviction-soft="" Apr 21 03:57:16.916122 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913408 2570 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 21 03:57:16.916122 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913411 2570 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 21 03:57:16.916122 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913414 2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 21 03:57:16.916122 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913417 2570 flags.go:64] FLAG: --experimental-mounter-path="" Apr 21 03:57:16.916122 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913420 2570 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 21 03:57:16.916122 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913423 2570 flags.go:64] FLAG: --fail-swap-on="true" Apr 21 03:57:16.916122 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913426 2570 flags.go:64] FLAG: --feature-gates="" Apr 21 03:57:16.916122 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913429 2570 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 03:57:16.916122 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913432 2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 03:57:16.916122 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913435 2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 03:57:16.916122 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913438 2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 03:57:16.916122 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913441 2570 flags.go:64] FLAG: --healthz-port="10248" Apr 21 03:57:16.916122 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913444 2570 flags.go:64] FLAG: --help="false" Apr 21 03:57:16.916122 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913447 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-131-93.ec2.internal" Apr 21 03:57:16.916122 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913450 2570 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 03:57:16.916122 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913454 2570 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 03:57:16.916122 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913457 2570 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 03:57:16.916791 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913460 2570 flags.go:64] 
FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 03:57:16.916791 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913463 2570 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 03:57:16.916791 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913467 2570 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 03:57:16.916791 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913470 2570 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 03:57:16.916791 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913473 2570 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 03:57:16.916791 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913476 2570 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 03:57:16.916791 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913479 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 03:57:16.916791 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913482 2570 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 03:57:16.916791 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913486 2570 flags.go:64] FLAG: --kube-reserved="" Apr 21 03:57:16.916791 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913489 2570 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 03:57:16.916791 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913492 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 03:57:16.916791 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913495 2570 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 03:57:16.916791 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913498 2570 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 03:57:16.916791 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913500 2570 flags.go:64] FLAG: --lock-file="" Apr 21 03:57:16.916791 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913503 2570 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 03:57:16.916791 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913506 2570 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 03:57:16.916791 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913509 2570 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 03:57:16.916791 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913514 2570 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 03:57:16.916791 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913517 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 03:57:16.916791 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913520 2570 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 03:57:16.916791 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913523 2570 flags.go:64] FLAG: --logging-format="text" Apr 21 03:57:16.916791 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913526 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 03:57:16.916791 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913529 2570 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 03:57:16.916791 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913532 2570 flags.go:64] FLAG: --manifest-url="" Apr 21 03:57:16.917381 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913535 2570 flags.go:64] FLAG: --manifest-url-header="" Apr 21 03:57:16.917381 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913539 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 03:57:16.917381 ip-10-0-131-93 
kubenswrapper[2570]: I0421 03:57:16.913542 2570 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 03:57:16.917381 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913547 2570 flags.go:64] FLAG: --max-pods="110" Apr 21 03:57:16.917381 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913550 2570 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 03:57:16.917381 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913553 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 03:57:16.917381 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913556 2570 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 03:57:16.917381 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913559 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 03:57:16.917381 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913562 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 03:57:16.917381 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913565 2570 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 03:57:16.917381 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913568 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 03:57:16.917381 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913576 2570 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 03:57:16.917381 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913579 2570 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 03:57:16.917381 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913582 2570 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 03:57:16.917381 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913585 2570 flags.go:64] FLAG: --pod-cidr="" Apr 21 03:57:16.917381 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913588 2570 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 03:57:16.917381 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913594 2570 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 03:57:16.917381 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913597 2570 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 03:57:16.917381 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913600 2570 flags.go:64] FLAG: --pods-per-core="0" Apr 21 03:57:16.917381 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913603 2570 flags.go:64] FLAG: --port="10250" Apr 21 03:57:16.917381 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913606 2570 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 03:57:16.917381 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913609 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-09892c0c8d331496a" Apr 21 03:57:16.917381 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913612 2570 flags.go:64] FLAG: --qos-reserved="" Apr 21 03:57:16.917381 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913615 2570 flags.go:64] FLAG: --read-only-port="10255" Apr 21 03:57:16.917957 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913618 2570 flags.go:64] FLAG: --register-node="true" Apr 21 03:57:16.917957 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913621 2570 flags.go:64] FLAG: --register-schedulable="true" Apr 21 03:57:16.917957 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913624 2570 flags.go:64] FLAG: --register-with-taints="" Apr 21 03:57:16.917957 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913630 2570 flags.go:64] 
FLAG: --registry-burst="10" Apr 21 03:57:16.917957 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913633 2570 flags.go:64] FLAG: --registry-qps="5" Apr 21 03:57:16.917957 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913636 2570 flags.go:64] FLAG: --reserved-cpus="" Apr 21 03:57:16.917957 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913639 2570 flags.go:64] FLAG: --reserved-memory="" Apr 21 03:57:16.917957 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913643 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 03:57:16.917957 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913648 2570 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 03:57:16.917957 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913652 2570 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 03:57:16.917957 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913655 2570 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 03:57:16.917957 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913658 2570 flags.go:64] FLAG: --runonce="false" Apr 21 03:57:16.917957 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913661 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 03:57:16.917957 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913664 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 03:57:16.917957 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913668 2570 flags.go:64] FLAG: --seccomp-default="false" Apr 21 03:57:16.917957 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913671 2570 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 03:57:16.917957 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913673 2570 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 03:57:16.917957 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913677 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 03:57:16.917957 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913680 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 03:57:16.917957 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913683 2570 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 03:57:16.917957 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913686 2570 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 03:57:16.917957 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913689 2570 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 03:57:16.917957 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913692 2570 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 03:57:16.917957 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913695 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 03:57:16.917957 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913698 2570 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 03:57:16.917957 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913701 2570 flags.go:64] FLAG: --system-cgroups="" Apr 21 03:57:16.918626 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913704 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 03:57:16.918626 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913709 2570 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 03:57:16.918626 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913712 2570 flags.go:64] FLAG: --tls-cert-file="" Apr 21 03:57:16.918626 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913715 2570 flags.go:64] FLAG: 
--tls-cipher-suites="[]" Apr 21 03:57:16.918626 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913719 2570 flags.go:64] FLAG: --tls-min-version="" Apr 21 03:57:16.918626 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913722 2570 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 03:57:16.918626 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913725 2570 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 03:57:16.918626 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913728 2570 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 03:57:16.918626 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913730 2570 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 03:57:16.918626 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913733 2570 flags.go:64] FLAG: --v="2" Apr 21 03:57:16.918626 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913738 2570 flags.go:64] FLAG: --version="false" Apr 21 03:57:16.918626 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913742 2570 flags.go:64] FLAG: --vmodule="" Apr 21 03:57:16.918626 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913746 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 03:57:16.918626 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.913749 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 03:57:16.918626 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913836 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 03:57:16.918626 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913839 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 03:57:16.918626 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913842 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 03:57:16.918626 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913845 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 03:57:16.918626 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913848 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 03:57:16.918626 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913851 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 03:57:16.918626 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913854 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 03:57:16.918626 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913858 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 03:57:16.919169 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913862 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 03:57:16.919169 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913865 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 03:57:16.919169 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913867 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 03:57:16.919169 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913870 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 03:57:16.919169 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913872 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 03:57:16.919169 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913876 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 03:57:16.919169 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913879 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 03:57:16.919169 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913882 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 21 03:57:16.919169 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913884 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 03:57:16.919169 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913887 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 03:57:16.919169 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913890 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 03:57:16.919169 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913893 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 03:57:16.919169 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913895 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 03:57:16.919169 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913898 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 03:57:16.919169 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913900 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 03:57:16.919169 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913903 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 03:57:16.919169 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913905 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 03:57:16.919169 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913908 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 03:57:16.919169 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913910 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 03:57:16.919169 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913913 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 03:57:16.919679 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913915 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 03:57:16.919679 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913918 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 03:57:16.919679 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913920 2570 
feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 03:57:16.919679 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913923 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 03:57:16.919679 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913928 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 03:57:16.919679 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913930 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 03:57:16.919679 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913932 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 03:57:16.919679 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913935 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 03:57:16.919679 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913938 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 03:57:16.919679 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913940 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 03:57:16.919679 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913943 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 03:57:16.919679 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913946 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 03:57:16.919679 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913948 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 03:57:16.919679 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913951 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 03:57:16.919679 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913955 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 03:57:16.919679 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913959 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 03:57:16.919679 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913963 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 03:57:16.919679 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913967 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 03:57:16.919679 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913972 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 03:57:16.919679 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913977 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 03:57:16.920257 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913980 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 03:57:16.920257 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913982 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 03:57:16.920257 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913985 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 03:57:16.920257 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913988 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 03:57:16.920257 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913990 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 
21 03:57:16.920257 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913993 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 03:57:16.920257 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913995 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 03:57:16.920257 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.913998 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 03:57:16.920257 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914000 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 03:57:16.920257 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914003 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 03:57:16.920257 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914005 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 03:57:16.920257 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914008 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 03:57:16.920257 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914010 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 03:57:16.920257 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914013 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 03:57:16.920257 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914015 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 03:57:16.920257 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914018 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 03:57:16.920257 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914021 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 03:57:16.920257 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914024 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 03:57:16.920257 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914026 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 03:57:16.920724 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914029 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 03:57:16.920724 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914032 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 03:57:16.920724 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914049 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 03:57:16.920724 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914052 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 03:57:16.920724 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914055 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 03:57:16.920724 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914058 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 03:57:16.920724 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914061 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 03:57:16.920724 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914063 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 03:57:16.920724 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914066 
2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 03:57:16.920724 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914068 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 03:57:16.920724 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914071 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 03:57:16.920724 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914074 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 03:57:16.920724 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914077 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 03:57:16.920724 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914080 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 03:57:16.920724 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914082 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 03:57:16.920724 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914085 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 03:57:16.920724 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914087 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 03:57:16.920724 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914091 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 03:57:16.920724 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.914095 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 03:57:16.921193 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.914889 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 03:57:16.921364 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.921346 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 21 03:57:16.921397 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.921365 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 21 03:57:16.921424 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921415 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 03:57:16.921424 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921421 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 03:57:16.921424 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921424 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 03:57:16.921505 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921428 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 03:57:16.921505 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921431 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 03:57:16.921505 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921434 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 03:57:16.921505 ip-10-0-131-93 
kubenswrapper[2570]: W0421 03:57:16.921437 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 03:57:16.921505 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921440 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 03:57:16.921505 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921443 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 03:57:16.921505 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921446 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 03:57:16.921505 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921449 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 03:57:16.921505 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921451 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 03:57:16.921505 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921454 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 03:57:16.921505 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921456 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 03:57:16.921505 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921459 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 03:57:16.921505 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921462 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 03:57:16.921505 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921464 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 03:57:16.921505 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921467 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 03:57:16.921505 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921470 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 03:57:16.921505 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921472 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 03:57:16.921505 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921475 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 03:57:16.921505 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921478 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 03:57:16.921505 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921481 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 03:57:16.921997 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921483 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 03:57:16.921997 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921486 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 03:57:16.921997 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921488 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 03:57:16.921997 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921492 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 03:57:16.921997 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921494 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 03:57:16.921997 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921497 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 
03:57:16.921997 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921499 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 03:57:16.921997 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921502 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 03:57:16.921997 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921504 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 03:57:16.921997 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921507 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 03:57:16.921997 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921511 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 03:57:16.921997 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921516 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 03:57:16.921997 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921519 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 03:57:16.921997 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921522 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 03:57:16.921997 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921525 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 21 03:57:16.921997 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921528 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 03:57:16.921997 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921530 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 03:57:16.921997 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921533 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 03:57:16.921997 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921536 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 03:57:16.921997 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921538 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 03:57:16.922498 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921541 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 03:57:16.922498 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921544 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 03:57:16.922498 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921547 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 03:57:16.922498 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921549 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 03:57:16.922498 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921552 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 03:57:16.922498 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921555 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 03:57:16.922498 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921557 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 03:57:16.922498 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921560 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 03:57:16.922498 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921563 2570 feature_gate.go:328] 
unrecognized feature gate: GCPClusterHostedDNS Apr 21 03:57:16.922498 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921566 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 03:57:16.922498 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921569 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 03:57:16.922498 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921572 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 03:57:16.922498 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921575 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 03:57:16.922498 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921578 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 03:57:16.922498 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921580 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 03:57:16.922498 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921583 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 03:57:16.922498 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921586 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 03:57:16.922498 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921588 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 03:57:16.922498 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921591 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 03:57:16.922961 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921594 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 03:57:16.922961 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921596 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 03:57:16.922961 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921599 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 03:57:16.922961 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921601 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 03:57:16.922961 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921604 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 03:57:16.922961 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921606 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 03:57:16.922961 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921609 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 03:57:16.922961 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921611 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 03:57:16.922961 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921614 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 03:57:16.922961 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921616 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 03:57:16.922961 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921619 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 03:57:16.922961 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921622 2570 feature_gate.go:328] unrecognized feature 
gate: GatewayAPI Apr 21 03:57:16.922961 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921625 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 03:57:16.922961 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921627 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 03:57:16.922961 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921630 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 03:57:16.922961 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921632 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 03:57:16.922961 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921635 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 03:57:16.922961 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921637 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 03:57:16.922961 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921640 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 03:57:16.922961 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921642 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 03:57:16.923466 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921645 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 03:57:16.923466 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921649 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 03:57:16.923466 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921652 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 03:57:16.923466 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921655 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 03:57:16.923466 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.921661 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 03:57:16.923466 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921761 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 03:57:16.923466 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921766 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 03:57:16.923466 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921769 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 03:57:16.923466 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921772 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 03:57:16.923466 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921775 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 03:57:16.923466 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921777 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 03:57:16.923466 ip-10-0-131-93 kubenswrapper[2570]: W0421 
03:57:16.921780 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 03:57:16.923466 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921783 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 03:57:16.923466 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921786 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 03:57:16.923466 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921788 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 03:57:16.923830 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921791 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 03:57:16.923830 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921794 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 03:57:16.923830 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921796 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 03:57:16.923830 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921799 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 03:57:16.923830 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921803 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 03:57:16.923830 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921807 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 03:57:16.923830 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921809 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 03:57:16.923830 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921812 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 03:57:16.923830 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921815 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 03:57:16.923830 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921818 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 03:57:16.923830 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921820 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 03:57:16.923830 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921823 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 03:57:16.923830 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921826 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 03:57:16.923830 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921828 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 03:57:16.923830 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921831 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 03:57:16.923830 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921833 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 03:57:16.923830 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921836 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 03:57:16.923830 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921838 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 03:57:16.923830 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921841 2570 feature_gate.go:328] unrecognized feature gate: 
MultiArchInstallAzure Apr 21 03:57:16.923830 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921843 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 03:57:16.924344 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921846 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 03:57:16.924344 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921848 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 03:57:16.924344 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921851 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 03:57:16.924344 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921854 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 03:57:16.924344 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921857 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 03:57:16.924344 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921859 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 03:57:16.924344 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921862 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 03:57:16.924344 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921864 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 03:57:16.924344 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921867 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 03:57:16.924344 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921869 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 03:57:16.924344 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921873 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 03:57:16.924344 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921877 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 03:57:16.924344 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921880 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 03:57:16.924344 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921883 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 03:57:16.924344 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921886 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 03:57:16.924344 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921888 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 03:57:16.924344 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921891 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 03:57:16.924344 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921893 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 03:57:16.924344 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921896 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 03:57:16.924344 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921898 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 03:57:16.924827 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921901 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 03:57:16.924827 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921904 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 03:57:16.924827 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921907 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 03:57:16.924827 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921909 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 03:57:16.924827 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921911 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 03:57:16.924827 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921914 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 03:57:16.924827 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921917 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 03:57:16.924827 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921919 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 03:57:16.924827 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921922 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 03:57:16.924827 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921924 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 03:57:16.924827 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921927 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 03:57:16.924827 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921929 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 03:57:16.924827 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921932 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 03:57:16.924827 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921934 2570 feature_gate.go:328] unrecognized 
feature gate: ClusterVersionOperatorConfiguration Apr 21 03:57:16.924827 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921937 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 03:57:16.924827 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921940 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 03:57:16.924827 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921942 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 03:57:16.924827 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921945 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 03:57:16.924827 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921948 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 03:57:16.924827 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921950 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 03:57:16.925395 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921953 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 03:57:16.925395 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921955 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 03:57:16.925395 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921958 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 03:57:16.925395 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921960 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 03:57:16.925395 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921962 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 21 03:57:16.925395 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921965 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 03:57:16.925395 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921967 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 03:57:16.925395 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921970 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 03:57:16.925395 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921973 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 03:57:16.925395 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921975 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 03:57:16.925395 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921978 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 03:57:16.925395 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921980 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 03:57:16.925395 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921983 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 03:57:16.925395 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921985 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 03:57:16.925395 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921988 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 03:57:16.925395 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:16.921990 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 03:57:16.925789 
ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.921995 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 03:57:16.925789 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.922755 2570 server.go:962] "Client rotation is on, will bootstrap in background" Apr 21 03:57:16.925789 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.924706 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 21 03:57:16.925789 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.925641 2570 server.go:1019] "Starting client certificate rotation" Apr 21 03:57:16.925789 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.925738 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 03:57:16.925789 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.925779 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 03:57:16.950507 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.950487 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 03:57:16.955353 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.955338 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 03:57:16.970589 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.970566 2570 log.go:25] "Validated CRI v1 runtime API" Apr 21 03:57:16.976654 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.976627 2570 log.go:25] "Validated CRI v1 image API" Apr 21 03:57:16.979444 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.979418 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 03:57:16.979919 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.979905 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 21 03:57:16.982603 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.982578 2570 fs.go:135] Filesystem UUIDs: map[29fe20e0-e37d-4f72-867d-a9e3401ae55b:/dev/nvme0n1p4 752c12f7-8913-4ad0-a815-d2fab979d140:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 21 03:57:16.982667 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.982602 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 21 03:57:16.987549 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.987440 2570 manager.go:217] Machine: {Timestamp:2026-04-21 03:57:16.986347749 +0000 UTC 
m=+0.414119636 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099968 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec22d53fdf42df59cfbcbff8d7ebf0bf SystemUUID:ec22d53f-df42-df59-cfbc-bff8d7ebf0bf BootID:654e4b2e-93cf-407d-b9f2-c8b1d183ea56 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c0:00:7b:37:1b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c0:00:7b:37:1b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:22:b7:b8:05:1e:57 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 21 03:57:16.987549 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.987544 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 21 03:57:16.987676 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.987665 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 21 03:57:16.990263 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.990236 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 21 03:57:16.990420 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.990265 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-93.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 21 03:57:16.990462 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.990430 2570 topology_manager.go:138] "Creating topology manager with none policy" Apr 21 03:57:16.990462 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.990440 2570 container_manager_linux.go:306] "Creating device plugin manager" Apr 21 03:57:16.990462 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.990452 2570 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 03:57:16.990544 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.990469 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 03:57:16.991340 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.991330 2570 state_mem.go:36] "Initialized new in-memory state store" Apr 21 03:57:16.991441 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.991432 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 03:57:16.994011 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.994002 2570 kubelet.go:491] "Attempting to sync node with API server" Apr 21 03:57:16.994070 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.994016 2570 kubelet.go:386] "Adding static pod path" 
path="/etc/kubernetes/manifests" Apr 21 03:57:16.994726 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.994716 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 03:57:16.994761 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.994730 2570 kubelet.go:397] "Adding apiserver pod source" Apr 21 03:57:16.994761 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.994740 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 21 03:57:16.996111 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.996098 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 03:57:16.996161 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.996118 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 03:57:16.999015 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:16.998996 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 03:57:17.000875 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.000861 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 03:57:17.002277 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.002259 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 03:57:17.002340 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.002285 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 03:57:17.002340 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.002294 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 03:57:17.002340 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.002303 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 03:57:17.002340 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.002316 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 03:57:17.002340 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.002328 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 03:57:17.002340 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.002335 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 03:57:17.002340 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.002342 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 03:57:17.002596 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.002349 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 03:57:17.002596 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.002355 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 03:57:17.002596 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.002363 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 03:57:17.002596 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.002372 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 03:57:17.003303 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.003291 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 03:57:17.003303 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.003301 2570 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/image" Apr 21 03:57:17.006576 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.006551 2570 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-93.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 03:57:17.006708 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:17.006670 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 03:57:17.006753 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:17.006709 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-93.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 03:57:17.007156 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.007144 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 03:57:17.007193 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.007180 2570 server.go:1295] "Started kubelet" Apr 21 03:57:17.007278 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.007256 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 03:57:17.007375 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.007330 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 03:57:17.007428 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.007409 2570 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 03:57:17.010459 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.010432 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 03:57:17.010641 ip-10-0-131-93 systemd[1]: Started Kubernetes Kubelet. 
Apr 21 03:57:17.011319 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.011303 2570 server.go:317] "Adding debug handlers to kubelet server" Apr 21 03:57:17.017448 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:17.016470 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-93.ec2.internal.18a8431b2dad140c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-93.ec2.internal,UID:ip-10-0-131-93.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-93.ec2.internal,},FirstTimestamp:2026-04-21 03:57:17.007156236 +0000 UTC m=+0.434928102,LastTimestamp:2026-04-21 03:57:17.007156236 +0000 UTC m=+0.434928102,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-93.ec2.internal,}" Apr 21 03:57:17.019791 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.019770 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5wbzz" Apr 21 03:57:17.019978 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:17.019950 2570 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 03:57:17.020047 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.019977 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 03:57:17.020047 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.019995 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 03:57:17.020653 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.020633 2570 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 03:57:17.020653 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.020655 2570 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 03:57:17.020832 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.020819 2570 reconstruct.go:97] "Volume reconstruction finished" Apr 21 03:57:17.020832 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.020831 2570 reconciler.go:26] "Reconciler: start to sync state" Apr 21 03:57:17.020955 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.020880 2570 factory.go:55] Registering systemd factory Apr 21 03:57:17.020955 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.020928 2570 factory.go:223] Registration of the systemd container factory successfully Apr 21 03:57:17.021047 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:17.020958 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-93.ec2.internal\" not found" Apr 21 03:57:17.021047 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.020997 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 03:57:17.021250 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.021233 2570 factory.go:153] Registering CRI-O factory Apr 21 03:57:17.021250 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.021252 2570 factory.go:223] Registration of the crio container factory successfully Apr 21 03:57:17.021379 ip-10-0-131-93 kubenswrapper[2570]: I0421 
03:57:17.021326 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 03:57:17.021379 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.021368 2570 factory.go:103] Registering Raw factory Apr 21 03:57:17.021484 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.021394 2570 manager.go:1196] Started watching for new ooms in manager Apr 21 03:57:17.021774 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.021760 2570 manager.go:319] Starting recovery of all containers Apr 21 03:57:17.022167 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:17.022112 2570 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-93.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 21 03:57:17.022268 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:17.022246 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 21 03:57:17.026913 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.026842 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5wbzz" Apr 21 03:57:17.032154 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.032059 2570 manager.go:324] Recovery completed Apr 21 03:57:17.036495 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.036482 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:57:17.040927 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.040912 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-93.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:57:17.040984 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.040948 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-93.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:57:17.040984 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.040961 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-93.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:57:17.041439 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.041426 2570 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 03:57:17.041439 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.041437 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 03:57:17.041531 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.041468 2570 state_mem.go:36] "Initialized new in-memory state store" Apr 21 03:57:17.043875 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.043857 2570 policy_none.go:49] "None policy: Start" Apr 21 03:57:17.043875 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.043873 2570 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 03:57:17.043875 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.043882 2570 state_mem.go:35] "Initializing new in-memory state store" Apr 21 03:57:17.092647 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.086410 2570 
manager.go:341] "Starting Device Plugin manager" Apr 21 03:57:17.092647 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:17.086596 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 03:57:17.092647 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.086613 2570 server.go:85] "Starting device plugin registration server" Apr 21 03:57:17.092647 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.086833 2570 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 03:57:17.092647 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.086844 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 03:57:17.092647 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.086955 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 03:57:17.092647 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.087065 2570 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 03:57:17.092647 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.087073 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 03:57:17.092647 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:17.087574 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 21 03:57:17.092647 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:17.087605 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-93.ec2.internal\" not found" Apr 21 03:57:17.154253 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.154224 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 03:57:17.155532 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.155471 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 03:57:17.155532 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.155503 2570 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 03:57:17.155532 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.155527 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 21 03:57:17.155724 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.155537 2570 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 03:57:17.155724 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:17.155578 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 03:57:17.158907 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.158891 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 03:57:17.187850 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.187827 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:57:17.188764 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.188747 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-93.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:57:17.188863 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.188781 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-93.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:57:17.188863 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.188796 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-93.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:57:17.188863 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.188826 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-93.ec2.internal" Apr 21 03:57:17.197652 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.197636 2570 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-93.ec2.internal" Apr 21 03:57:17.197726 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:17.197658 2570 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-93.ec2.internal\": node \"ip-10-0-131-93.ec2.internal\" not found" Apr 21 03:57:17.207901 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:17.207884 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-93.ec2.internal\" not found" Apr 21 03:57:17.256136 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.256108 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-93.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-93.ec2.internal"] Apr 21 03:57:17.256254 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.256179 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:57:17.257126 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.257110 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-93.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:57:17.257223 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.257144 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-93.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:57:17.257223 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.257157 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-93.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:57:17.258413 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.258398 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:57:17.258545 ip-10-0-131-93 kubenswrapper[2570]: I0421 
03:57:17.258530 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-93.ec2.internal" Apr 21 03:57:17.258596 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.258561 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:57:17.259171 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.259156 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-93.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:57:17.259171 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.259168 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-93.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:57:17.259301 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.259187 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-93.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:57:17.259301 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.259203 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-93.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:57:17.259301 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.259190 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-93.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:57:17.259301 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.259272 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-93.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:57:17.260419 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.260403 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-93.ec2.internal" Apr 21 03:57:17.260493 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.260427 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:57:17.261071 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.261054 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-93.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:57:17.261141 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.261088 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-93.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:57:17.261141 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.261103 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-93.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:57:17.284698 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:17.284680 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-93.ec2.internal\" not found" node="ip-10-0-131-93.ec2.internal" Apr 21 03:57:17.289163 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:17.289146 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-93.ec2.internal\" not found" node="ip-10-0-131-93.ec2.internal" Apr 21 03:57:17.308843 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:17.308826 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-93.ec2.internal\" not found" Apr 21 03:57:17.409566 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:17.409499 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-93.ec2.internal\" not found" Apr 21 03:57:17.422922 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.422898 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5861d034824f0ed241e7223fb92ee95e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-93.ec2.internal\" (UID: \"5861d034824f0ed241e7223fb92ee95e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-93.ec2.internal" Apr 21 03:57:17.422991 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.422929 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/364f28bb57f65923dbeacedd6b253c36-config\") pod \"kube-apiserver-proxy-ip-10-0-131-93.ec2.internal\" (UID: \"364f28bb57f65923dbeacedd6b253c36\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-93.ec2.internal" Apr 21 03:57:17.422991 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.422947 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5861d034824f0ed241e7223fb92ee95e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-93.ec2.internal\" (UID: \"5861d034824f0ed241e7223fb92ee95e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-93.ec2.internal" Apr 21 03:57:17.510344 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:17.510306 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-93.ec2.internal\" not found" Apr 21 03:57:17.523716 ip-10-0-131-93 kubenswrapper[2570]: 
I0421 03:57:17.523692 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5861d034824f0ed241e7223fb92ee95e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-93.ec2.internal\" (UID: \"5861d034824f0ed241e7223fb92ee95e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-93.ec2.internal" Apr 21 03:57:17.523777 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.523721 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5861d034824f0ed241e7223fb92ee95e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-93.ec2.internal\" (UID: \"5861d034824f0ed241e7223fb92ee95e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-93.ec2.internal" Apr 21 03:57:17.523777 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.523740 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/364f28bb57f65923dbeacedd6b253c36-config\") pod \"kube-apiserver-proxy-ip-10-0-131-93.ec2.internal\" (UID: \"364f28bb57f65923dbeacedd6b253c36\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-93.ec2.internal" Apr 21 03:57:17.523841 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.523808 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5861d034824f0ed241e7223fb92ee95e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-93.ec2.internal\" (UID: \"5861d034824f0ed241e7223fb92ee95e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-93.ec2.internal" Apr 21 03:57:17.523873 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.523859 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5861d034824f0ed241e7223fb92ee95e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-93.ec2.internal\" (UID: \"5861d034824f0ed241e7223fb92ee95e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-93.ec2.internal" Apr 21 03:57:17.523905 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.523888 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/364f28bb57f65923dbeacedd6b253c36-config\") pod \"kube-apiserver-proxy-ip-10-0-131-93.ec2.internal\" (UID: \"364f28bb57f65923dbeacedd6b253c36\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-93.ec2.internal" Apr 21 03:57:17.586891 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.586847 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-93.ec2.internal" Apr 21 03:57:17.591569 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.591550 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-93.ec2.internal" Apr 21 03:57:17.611264 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:17.611234 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-93.ec2.internal\" not found" Apr 21 03:57:17.711939 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:17.711849 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-93.ec2.internal\" not found" Apr 21 03:57:17.812444 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:17.812410 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-93.ec2.internal\" not found" Apr 21 03:57:17.913062 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:17.913020 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-93.ec2.internal\" not found" Apr 21 03:57:17.925390 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.925363 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 21 03:57:17.925542 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:17.925523 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 03:57:18.013762 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:18.013735 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-93.ec2.internal\" not found" Apr 21 03:57:18.020735 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:18.020712 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 21 03:57:18.029347 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:18.029317 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 03:52:17 +0000 UTC" deadline="2027-11-20 11:13:53.411496368 +0000 UTC" Apr 21 03:57:18.029347 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:18.029344 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13879h16m35.382154856s" Apr 21 03:57:18.031431 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:18.031416 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 03:57:18.050165 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:18.050134 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-mxj5b" Apr 21 03:57:18.058825 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:18.058792 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5861d034824f0ed241e7223fb92ee95e.slice/crio-d8e8ffa44ea4925cbfda15baf81fd9a69a564e9a8dbb0c85eb7936deb4746f83 WatchSource:0}: Error finding container d8e8ffa44ea4925cbfda15baf81fd9a69a564e9a8dbb0c85eb7936deb4746f83: Status 404 returned error can't find the container with id d8e8ffa44ea4925cbfda15baf81fd9a69a564e9a8dbb0c85eb7936deb4746f83 Apr 21 03:57:18.059407 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:18.059388 2570 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod364f28bb57f65923dbeacedd6b253c36.slice/crio-323c2ac02e602ba6b94cfa9eeb07d9d56e8c3eebfc5a0768003a45984833a3ff WatchSource:0}: Error finding container 323c2ac02e602ba6b94cfa9eeb07d9d56e8c3eebfc5a0768003a45984833a3ff: Status 404 returned error can't find the container with id 323c2ac02e602ba6b94cfa9eeb07d9d56e8c3eebfc5a0768003a45984833a3ff Apr 21 03:57:18.059608 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:18.059593 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-mxj5b" Apr 21 03:57:18.064222 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:18.064207 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 03:57:18.114641 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:18.114591 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-93.ec2.internal\" not found" Apr 21 03:57:18.158858 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:18.158803 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-93.ec2.internal" event={"ID":"5861d034824f0ed241e7223fb92ee95e","Type":"ContainerStarted","Data":"d8e8ffa44ea4925cbfda15baf81fd9a69a564e9a8dbb0c85eb7936deb4746f83"} Apr 21 03:57:18.159755 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:18.159732 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-93.ec2.internal" event={"ID":"364f28bb57f65923dbeacedd6b253c36","Type":"ContainerStarted","Data":"323c2ac02e602ba6b94cfa9eeb07d9d56e8c3eebfc5a0768003a45984833a3ff"} Apr 21 03:57:18.214980 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:18.214895 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-93.ec2.internal\" not found" Apr 21 03:57:18.265535 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:18.265506 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 03:57:18.315538 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:18.315499 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-93.ec2.internal\" not found" Apr 21 03:57:18.319682 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:18.319667 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 03:57:18.415601 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:18.415571 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-93.ec2.internal\" not found" Apr 21 03:57:18.516449 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:18.516366 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-93.ec2.internal\" not found" Apr 21 03:57:18.610299 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:18.610266 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 03:57:18.620680 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:18.620653 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-93.ec2.internal" Apr 21 03:57:18.632158 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:18.632122 2570 warnings.go:110] 
"Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 03:57:18.633232 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:18.633215 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-93.ec2.internal" Apr 21 03:57:18.640563 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:18.640542 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 03:57:18.996060 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:18.995961 2570 apiserver.go:52] "Watching apiserver" Apr 21 03:57:19.004257 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.003809 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 21 03:57:19.004700 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.004675 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5","openshift-cluster-node-tuning-operator/tuned-65jhd","openshift-dns/node-resolver-26mzt","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-93.ec2.internal","openshift-multus/multus-zrwkk","openshift-multus/network-metrics-daemon-lf2dl","openshift-image-registry/node-ca-jw8wp","openshift-multus/multus-additional-cni-plugins-xddzt","openshift-network-diagnostics/network-check-target-ztjfh","openshift-network-operator/iptables-alerter-26xqh","openshift-ovn-kubernetes/ovnkube-node-6p6fc","kube-system/konnectivity-agent-kmbjr","kube-system/kube-apiserver-proxy-ip-10-0-131-93.ec2.internal"] Apr 21 03:57:19.006320 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.006298 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-26xqh" Apr 21 03:57:19.008539 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.008516 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 21 03:57:19.008635 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.008614 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.008696 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.008646 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 21 03:57:19.008696 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.008518 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 21 03:57:19.008792 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.008737 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-26mzt" Apr 21 03:57:19.008980 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.008873 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-6cd8b\"" Apr 21 03:57:19.009932 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.009914 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.011169 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.010624 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 21 03:57:19.011169 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.010729 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-lh26x\"" Apr 21 03:57:19.011169 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.010940 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 03:57:19.011169 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.010949 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-mplvf\"" Apr 21 03:57:19.011169 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.010984 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 21 03:57:19.011169 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.010940 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 03:57:19.012158 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.012139 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 03:57:19.012246 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.012159 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 03:57:19.012246 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.012146 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 03:57:19.012404 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.012387 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:57:19.012477 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.012463 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 03:57:19.012477 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:19.012464 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lf2dl" podUID="0642b1aa-ff76-4694-bad0-be2656b81005" Apr 21 03:57:19.012577 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.012393 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-zlc75\"" Apr 21 03:57:19.013835 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.013798 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jw8wp" Apr 21 03:57:19.013935 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.013916 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xddzt" Apr 21 03:57:19.015082 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.015068 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:57:19.015154 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:19.015134 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ztjfh" podUID="be04fd1e-83bf-49d7-8c60-4323b986ab81" Apr 21 03:57:19.015850 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.015826 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 03:57:19.015963 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.015946 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 03:57:19.016047 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.015999 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 03:57:19.016238 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.016221 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-bsm59\"" Apr 21 03:57:19.016371 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.016354 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 03:57:19.016451 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.016435 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-6l8s2\"" Apr 21 03:57:19.016512 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.016486 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 03:57:19.017703 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.017686 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5" Apr 21 03:57:19.017846 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.017831 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.018981 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.018961 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-kmbjr" Apr 21 03:57:19.019465 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.019445 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 03:57:19.019594 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.019550 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-zkdlh\"" Apr 21 03:57:19.019741 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.019718 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 03:57:19.020001 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.019985 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 03:57:19.020141 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.020123 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 03:57:19.020509 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.020483 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 03:57:19.020509 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.020490 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 03:57:19.020641 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.020505 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 03:57:19.020641 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.020541 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 03:57:19.021763 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.021743 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 03:57:19.022476 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.022012 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 03:57:19.022476 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.022161 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-4f8rx\"" Apr 21 03:57:19.022476 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.022224 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-p2fp7\"" Apr 21 03:57:19.022476 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.022382 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 03:57:19.022476 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.022431 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 03:57:19.032007 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.031988 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stvkw\" (UniqueName: 
\"kubernetes.io/projected/114ff0f2-95bc-49bb-be65-079af4d8294d-kube-api-access-stvkw\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.032113 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032012 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2b3ca347-33c6-4fbc-bc3d-a93f9f0c27fa-serviceca\") pod \"node-ca-jw8wp\" (UID: \"2b3ca347-33c6-4fbc-bc3d-a93f9f0c27fa\") " pod="openshift-image-registry/node-ca-jw8wp" Apr 21 03:57:19.032113 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032030 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-host-run-netns\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.032113 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032068 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-var-lib-kubelet\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.032113 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032092 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d9c4a2fd-4534-495a-8f40-6d8faf8f87e6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xddzt\" (UID: \"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6\") " pod="openshift-multus/multus-additional-cni-plugins-xddzt" Apr 21 03:57:19.032113 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032110 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8znnq\" (UniqueName: \"kubernetes.io/projected/d9c4a2fd-4534-495a-8f40-6d8faf8f87e6-kube-api-access-8znnq\") pod \"multus-additional-cni-plugins-xddzt\" (UID: \"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6\") " pod="openshift-multus/multus-additional-cni-plugins-xddzt" Apr 21 03:57:19.032307 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032128 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-run\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.032307 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032151 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk9v4\" (UniqueName: \"kubernetes.io/projected/26270642-aaa5-4b43-804b-56317d766266-kube-api-access-gk9v4\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.032307 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032175 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/114ff0f2-95bc-49bb-be65-079af4d8294d-cni-binary-copy\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " 
pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.032307 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032228 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-etc-kubernetes\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.032307 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032260 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4ec1512c-1a5f-42db-95f2-03f360fed0a5-device-dir\") pod \"aws-ebs-csi-driver-node-jrpm5\" (UID: \"4ec1512c-1a5f-42db-95f2-03f360fed0a5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5" Apr 21 03:57:19.032307 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032284 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-var-lib-openvswitch\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.032531 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032308 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2vm4\" (UniqueName: \"kubernetes.io/projected/2b3ca347-33c6-4fbc-bc3d-a93f9f0c27fa-kube-api-access-p2vm4\") pod \"node-ca-jw8wp\" (UID: \"2b3ca347-33c6-4fbc-bc3d-a93f9f0c27fa\") " pod="openshift-image-registry/node-ca-jw8wp" Apr 21 03:57:19.032531 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032331 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4ec1512c-1a5f-42db-95f2-03f360fed0a5-socket-dir\") pod \"aws-ebs-csi-driver-node-jrpm5\" (UID: \"4ec1512c-1a5f-42db-95f2-03f360fed0a5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5" Apr 21 03:57:19.032531 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032354 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3a284a31-b4ea-4280-a6d1-b84390d1488d-ovnkube-script-lib\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.032531 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032393 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/26270642-aaa5-4b43-804b-56317d766266-etc-tuned\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.032531 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032453 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4ec1512c-1a5f-42db-95f2-03f360fed0a5-etc-selinux\") pod \"aws-ebs-csi-driver-node-jrpm5\" (UID: \"4ec1512c-1a5f-42db-95f2-03f360fed0a5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5" Apr 21 03:57:19.032531 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032484 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4ec1512c-1a5f-42db-95f2-03f360fed0a5-sys-fs\") pod \"aws-ebs-csi-driver-node-jrpm5\" (UID: \"4ec1512c-1a5f-42db-95f2-03f360fed0a5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5" Apr 21 03:57:19.032531 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032508 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c98b9edd-432f-4ffa-a024-8b9f651147e0-agent-certs\") pod \"konnectivity-agent-kmbjr\" (UID: \"c98b9edd-432f-4ffa-a024-8b9f651147e0\") " pod="kube-system/konnectivity-agent-kmbjr" Apr 21 03:57:19.032531 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032530 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6a2e2e97-0696-4765-bf51-be31ec3c66ba-host-slash\") pod \"iptables-alerter-26xqh\" (UID: \"6a2e2e97-0696-4765-bf51-be31ec3c66ba\") " pod="openshift-network-operator/iptables-alerter-26xqh" Apr 21 03:57:19.032815 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032553 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-etc-modprobe-d\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.032815 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032583 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-host-var-lib-cni-bin\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.032815 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032639 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b3ca347-33c6-4fbc-bc3d-a93f9f0c27fa-host\") pod \"node-ca-jw8wp\" (UID: \"2b3ca347-33c6-4fbc-bc3d-a93f9f0c27fa\") " pod="openshift-image-registry/node-ca-jw8wp" Apr 21 03:57:19.032815 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032666 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d9c4a2fd-4534-495a-8f40-6d8faf8f87e6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xddzt\" (UID: \"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6\") " pod="openshift-multus/multus-additional-cni-plugins-xddzt" Apr 21 03:57:19.032815 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032691 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-etc-openvswitch\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.032815 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032717 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-sys\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.032815 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032747 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-host\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.032815 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032780 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b55qp\" (UniqueName: \"kubernetes.io/projected/8a8c6dad-8135-4d60-b437-56307544e064-kube-api-access-b55qp\") pod \"node-resolver-26mzt\" (UID: \"8a8c6dad-8135-4d60-b437-56307544e064\") " pod="openshift-dns/node-resolver-26mzt" Apr 21 03:57:19.032815 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032811 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-system-cni-dir\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.033143 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032840 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9c4a2fd-4534-495a-8f40-6d8faf8f87e6-system-cni-dir\") pod \"multus-additional-cni-plugins-xddzt\" (UID: \"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6\") " pod="openshift-multus/multus-additional-cni-plugins-xddzt" Apr 21 03:57:19.033143 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032868 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l27g\" (UniqueName: \"kubernetes.io/projected/4ec1512c-1a5f-42db-95f2-03f360fed0a5-kube-api-access-7l27g\") pod \"aws-ebs-csi-driver-node-jrpm5\" (UID: \"4ec1512c-1a5f-42db-95f2-03f360fed0a5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5" Apr 21 03:57:19.033143 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032891 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-run-systemd\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.033143 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032919 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-etc-systemd\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.033143 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032942 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-multus-cni-dir\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " 
pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.033143 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.032966 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-multus-socket-dir-parent\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.033143 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.033020 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-host-run-multus-certs\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.033143 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.033062 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-run-ovn\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.033143 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.033087 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3a284a31-b4ea-4280-a6d1-b84390d1488d-env-overrides\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.033143 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.033109 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-lib-modules\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.033143 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.033130 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-cnibin\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.033570 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.033151 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/114ff0f2-95bc-49bb-be65-079af4d8294d-multus-daemon-config\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.033570 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.033178 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs\") pod \"network-metrics-daemon-lf2dl\" (UID: \"0642b1aa-ff76-4694-bad0-be2656b81005\") " pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:57:19.033570 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.033200 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/d9c4a2fd-4534-495a-8f40-6d8faf8f87e6-os-release\") pod \"multus-additional-cni-plugins-xddzt\" (UID: \"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6\") " pod="openshift-multus/multus-additional-cni-plugins-xddzt" Apr 21 03:57:19.033570 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.033231 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d9c4a2fd-4534-495a-8f40-6d8faf8f87e6-cni-binary-copy\") pod \"multus-additional-cni-plugins-xddzt\" (UID: \"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6\") " pod="openshift-multus/multus-additional-cni-plugins-xddzt" Apr 21 03:57:19.033570 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.033270 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ec1512c-1a5f-42db-95f2-03f360fed0a5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jrpm5\" (UID: \"4ec1512c-1a5f-42db-95f2-03f360fed0a5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5" Apr 21 03:57:19.033570 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.033298 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4ec1512c-1a5f-42db-95f2-03f360fed0a5-registration-dir\") pod \"aws-ebs-csi-driver-node-jrpm5\" (UID: \"4ec1512c-1a5f-42db-95f2-03f360fed0a5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5" Apr 21 03:57:19.033570 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.033343 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-host-var-lib-kubelet\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.033570 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.033370 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-hostroot\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.033570 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.033400 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw5dk\" (UniqueName: \"kubernetes.io/projected/0642b1aa-ff76-4694-bad0-be2656b81005-kube-api-access-mw5dk\") pod \"network-metrics-daemon-lf2dl\" (UID: \"0642b1aa-ff76-4694-bad0-be2656b81005\") " pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:57:19.033570 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.033441 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d9c4a2fd-4534-495a-8f40-6d8faf8f87e6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xddzt\" (UID: \"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6\") " pod="openshift-multus/multus-additional-cni-plugins-xddzt" Apr 21 03:57:19.033570 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.033476 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-systemd-units\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.033570 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.033503 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-run-openvswitch\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.033570 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.033526 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-host-cni-bin\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.033570 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.033550 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-etc-sysconfig\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.033570 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.033575 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-host-run-netns\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.034169 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.033596 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d9c4a2fd-4534-495a-8f40-6d8faf8f87e6-cnibin\") pod \"multus-additional-cni-plugins-xddzt\" (UID: \"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6\") " pod="openshift-multus/multus-additional-cni-plugins-xddzt" Apr 21 03:57:19.034169 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.033621 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-log-socket\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.034169 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.033965 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.034169 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.034133 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-749sr\" (UniqueName: \"kubernetes.io/projected/3a284a31-b4ea-4280-a6d1-b84390d1488d-kube-api-access-749sr\") pod \"ovnkube-node-6p6fc\" (UID: 
\"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.034318 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.034172 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-etc-kubernetes\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.034318 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.034207 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8a8c6dad-8135-4d60-b437-56307544e064-hosts-file\") pod \"node-resolver-26mzt\" (UID: \"8a8c6dad-8135-4d60-b437-56307544e064\") " pod="openshift-dns/node-resolver-26mzt" Apr 21 03:57:19.034318 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.034241 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-host-run-k8s-cni-cncf-io\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.034318 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.034277 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8tcf\" (UniqueName: \"kubernetes.io/projected/be04fd1e-83bf-49d7-8c60-4323b986ab81-kube-api-access-c8tcf\") pod \"network-check-target-ztjfh\" (UID: \"be04fd1e-83bf-49d7-8c60-4323b986ab81\") " pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:57:19.034318 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.034308 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-host-kubelet\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.034534 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.034363 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-host-run-ovn-kubernetes\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.034534 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.034452 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-etc-sysctl-d\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.034534 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.034495 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-etc-sysctl-conf\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.034682 ip-10-0-131-93 kubenswrapper[2570]: 
I0421 03:57:19.034549 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-host-var-lib-cni-multus\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.034682 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.034580 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-host-cni-netd\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.034682 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.034626 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8a8c6dad-8135-4d60-b437-56307544e064-tmp-dir\") pod \"node-resolver-26mzt\" (UID: \"8a8c6dad-8135-4d60-b437-56307544e064\") " pod="openshift-dns/node-resolver-26mzt" Apr 21 03:57:19.034682 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.034659 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-host-slash\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.034862 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.034713 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-node-log\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.034862 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.034747 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxd8k\" (UniqueName: \"kubernetes.io/projected/6a2e2e97-0696-4765-bf51-be31ec3c66ba-kube-api-access-dxd8k\") pod \"iptables-alerter-26xqh\" (UID: \"6a2e2e97-0696-4765-bf51-be31ec3c66ba\") " pod="openshift-network-operator/iptables-alerter-26xqh" Apr 21 03:57:19.034862 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.034804 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-os-release\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.034862 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.034828 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a284a31-b4ea-4280-a6d1-b84390d1488d-ovnkube-config\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.034862 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.034858 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/3a284a31-b4ea-4280-a6d1-b84390d1488d-ovn-node-metrics-cert\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.035129 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.034890 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c98b9edd-432f-4ffa-a024-8b9f651147e0-konnectivity-ca\") pod \"konnectivity-agent-kmbjr\" (UID: \"c98b9edd-432f-4ffa-a024-8b9f651147e0\") " pod="kube-system/konnectivity-agent-kmbjr" Apr 21 03:57:19.035129 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.034921 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6a2e2e97-0696-4765-bf51-be31ec3c66ba-iptables-alerter-script\") pod \"iptables-alerter-26xqh\" (UID: \"6a2e2e97-0696-4765-bf51-be31ec3c66ba\") " pod="openshift-network-operator/iptables-alerter-26xqh" Apr 21 03:57:19.035129 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.034985 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/26270642-aaa5-4b43-804b-56317d766266-tmp\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.035129 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.035017 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-multus-conf-dir\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.060805 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.060778 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 03:52:18 +0000 UTC" deadline="2027-11-18 01:26:10.209890444 +0000 UTC" Apr 21 03:57:19.060805 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.060804 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13821h28m51.149088995s" Apr 21 03:57:19.135677 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.135647 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b55qp\" (UniqueName: \"kubernetes.io/projected/8a8c6dad-8135-4d60-b437-56307544e064-kube-api-access-b55qp\") pod \"node-resolver-26mzt\" (UID: \"8a8c6dad-8135-4d60-b437-56307544e064\") " pod="openshift-dns/node-resolver-26mzt" Apr 21 03:57:19.135855 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.135690 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-system-cni-dir\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.135855 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.135717 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9c4a2fd-4534-495a-8f40-6d8faf8f87e6-system-cni-dir\") pod \"multus-additional-cni-plugins-xddzt\" (UID: \"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6\") 
" pod="openshift-multus/multus-additional-cni-plugins-xddzt" Apr 21 03:57:19.135855 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.135758 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7l27g\" (UniqueName: \"kubernetes.io/projected/4ec1512c-1a5f-42db-95f2-03f360fed0a5-kube-api-access-7l27g\") pod \"aws-ebs-csi-driver-node-jrpm5\" (UID: \"4ec1512c-1a5f-42db-95f2-03f360fed0a5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5" Apr 21 03:57:19.135855 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.135777 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-run-systemd\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.135855 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.135793 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-etc-systemd\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.135855 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.135801 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-system-cni-dir\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.135855 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.135817 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-multus-cni-dir\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.135855 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.135856 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-multus-socket-dir-parent\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.136297 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.135857 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-run-systemd\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.136297 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.135854 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9c4a2fd-4534-495a-8f40-6d8faf8f87e6-system-cni-dir\") pod \"multus-additional-cni-plugins-xddzt\" (UID: \"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6\") " pod="openshift-multus/multus-additional-cni-plugins-xddzt" Apr 21 03:57:19.136297 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.135883 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-host-run-multus-certs\") pod 
\"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.136297 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.135905 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-etc-systemd\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.136297 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.135923 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-host-run-multus-certs\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.136297 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136027 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-run-ovn\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.136297 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136050 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-multus-socket-dir-parent\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.136297 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136072 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3a284a31-b4ea-4280-a6d1-b84390d1488d-env-overrides\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.136297 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136080 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-multus-cni-dir\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.136297 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136087 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-run-ovn\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.136297 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136100 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-lib-modules\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.136297 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136126 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-cnibin\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " 
pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.136297 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136149 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/114ff0f2-95bc-49bb-be65-079af4d8294d-multus-daemon-config\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.136297 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136172 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs\") pod \"network-metrics-daemon-lf2dl\" (UID: \"0642b1aa-ff76-4694-bad0-be2656b81005\") " pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:57:19.136297 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136194 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d9c4a2fd-4534-495a-8f40-6d8faf8f87e6-os-release\") pod \"multus-additional-cni-plugins-xddzt\" (UID: \"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6\") " pod="openshift-multus/multus-additional-cni-plugins-xddzt" Apr 21 03:57:19.136297 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136196 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-cnibin\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.136297 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136216 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d9c4a2fd-4534-495a-8f40-6d8faf8f87e6-cni-binary-copy\") pod \"multus-additional-cni-plugins-xddzt\" (UID: \"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6\") " pod="openshift-multus/multus-additional-cni-plugins-xddzt" Apr 21 03:57:19.136297 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136241 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ec1512c-1a5f-42db-95f2-03f360fed0a5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jrpm5\" (UID: \"4ec1512c-1a5f-42db-95f2-03f360fed0a5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5" Apr 21 03:57:19.137121 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136250 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-lib-modules\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.137121 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136265 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4ec1512c-1a5f-42db-95f2-03f360fed0a5-registration-dir\") pod \"aws-ebs-csi-driver-node-jrpm5\" (UID: \"4ec1512c-1a5f-42db-95f2-03f360fed0a5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5" Apr 21 03:57:19.137121 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136292 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-host-var-lib-kubelet\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.137121 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:19.136299 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:19.137121 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136317 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-hostroot\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.137121 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136341 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mw5dk\" (UniqueName: \"kubernetes.io/projected/0642b1aa-ff76-4694-bad0-be2656b81005-kube-api-access-mw5dk\") pod \"network-metrics-daemon-lf2dl\" (UID: \"0642b1aa-ff76-4694-bad0-be2656b81005\") " pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:57:19.137121 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:19.136375 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs podName:0642b1aa-ff76-4694-bad0-be2656b81005 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:19.636352567 +0000 UTC m=+3.064124434 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs") pod "network-metrics-daemon-lf2dl" (UID: "0642b1aa-ff76-4694-bad0-be2656b81005") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:19.137121 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136399 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d9c4a2fd-4534-495a-8f40-6d8faf8f87e6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xddzt\" (UID: \"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6\") " pod="openshift-multus/multus-additional-cni-plugins-xddzt" Apr 21 03:57:19.137121 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136434 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-systemd-units\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.137121 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136458 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-run-openvswitch\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.137121 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136482 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-host-cni-bin\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 
03:57:19.137121 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136507 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-etc-sysconfig\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.137121 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136555 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-systemd-units\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.137121 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136563 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-etc-sysconfig\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.137121 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136591 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-host-run-netns\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.137121 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136595 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3a284a31-b4ea-4280-a6d1-b84390d1488d-env-overrides\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.137121 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136638 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d9c4a2fd-4534-495a-8f40-6d8faf8f87e6-os-release\") pod \"multus-additional-cni-plugins-xddzt\" (UID: \"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6\") " pod="openshift-multus/multus-additional-cni-plugins-xddzt" Apr 21 03:57:19.137882 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136643 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-host-cni-bin\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.137882 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136658 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d9c4a2fd-4534-495a-8f40-6d8faf8f87e6-cnibin\") pod \"multus-additional-cni-plugins-xddzt\" (UID: \"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6\") " pod="openshift-multus/multus-additional-cni-plugins-xddzt" Apr 21 03:57:19.137882 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136669 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-host-run-netns\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 
03:57:19.137882 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136694 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-log-socket\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.137882 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136697 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d9c4a2fd-4534-495a-8f40-6d8faf8f87e6-cnibin\") pod \"multus-additional-cni-plugins-xddzt\" (UID: \"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6\") " pod="openshift-multus/multus-additional-cni-plugins-xddzt" Apr 21 03:57:19.137882 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136603 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-run-openvswitch\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.137882 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136726 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.137882 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136753 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-749sr\" (UniqueName: \"kubernetes.io/projected/3a284a31-b4ea-4280-a6d1-b84390d1488d-kube-api-access-749sr\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.137882 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136756 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-hostroot\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.137882 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136783 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-etc-kubernetes\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.137882 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136811 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8a8c6dad-8135-4d60-b437-56307544e064-hosts-file\") pod \"node-resolver-26mzt\" (UID: \"8a8c6dad-8135-4d60-b437-56307544e064\") " pod="openshift-dns/node-resolver-26mzt" Apr 21 03:57:19.137882 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136813 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ec1512c-1a5f-42db-95f2-03f360fed0a5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jrpm5\" (UID: \"4ec1512c-1a5f-42db-95f2-03f360fed0a5\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5" Apr 21 03:57:19.137882 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136848 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/114ff0f2-95bc-49bb-be65-079af4d8294d-multus-daemon-config\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.137882 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136753 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-host-var-lib-kubelet\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.137882 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136839 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-host-run-k8s-cni-cncf-io\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.137882 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136894 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4ec1512c-1a5f-42db-95f2-03f360fed0a5-registration-dir\") pod \"aws-ebs-csi-driver-node-jrpm5\" (UID: \"4ec1512c-1a5f-42db-95f2-03f360fed0a5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5" Apr 21 03:57:19.137882 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136905 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8tcf\" (UniqueName: \"kubernetes.io/projected/be04fd1e-83bf-49d7-8c60-4323b986ab81-kube-api-access-c8tcf\") pod \"network-check-target-ztjfh\" (UID: \"be04fd1e-83bf-49d7-8c60-4323b986ab81\") " pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:57:19.138680 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136915 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.138680 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136796 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-log-socket\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.138680 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136978 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-etc-kubernetes\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.138680 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.136984 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-host-kubelet\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.138680 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137026 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-host-run-ovn-kubernetes\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.138680 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137090 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-etc-sysctl-d\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.138680 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137116 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-etc-sysctl-conf\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.138680 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137158 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-host-run-ovn-kubernetes\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.138680 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137188 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d9c4a2fd-4534-495a-8f40-6d8faf8f87e6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xddzt\" (UID: \"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6\") " pod="openshift-multus/multus-additional-cni-plugins-xddzt" Apr 21 03:57:19.138680 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137193 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-host-var-lib-cni-multus\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.138680 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137223 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-host-cni-netd\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.138680 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137248 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8a8c6dad-8135-4d60-b437-56307544e064-tmp-dir\") pod \"node-resolver-26mzt\" (UID: \"8a8c6dad-8135-4d60-b437-56307544e064\") " pod="openshift-dns/node-resolver-26mzt" Apr 21 03:57:19.138680 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137251 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-host-kubelet\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.138680 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137028 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8a8c6dad-8135-4d60-b437-56307544e064-hosts-file\") pod \"node-resolver-26mzt\" (UID: \"8a8c6dad-8135-4d60-b437-56307544e064\") " pod="openshift-dns/node-resolver-26mzt" Apr 21 03:57:19.138680 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137263 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-host-run-k8s-cni-cncf-io\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.138680 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137274 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-host-slash\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.138680 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137250 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-etc-sysctl-conf\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.139455 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137264 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d9c4a2fd-4534-495a-8f40-6d8faf8f87e6-cni-binary-copy\") pod \"multus-additional-cni-plugins-xddzt\" (UID: \"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6\") " pod="openshift-multus/multus-additional-cni-plugins-xddzt" Apr 21 03:57:19.139455 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137298 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-node-log\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.139455 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137330 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxd8k\" (UniqueName: \"kubernetes.io/projected/6a2e2e97-0696-4765-bf51-be31ec3c66ba-kube-api-access-dxd8k\") pod \"iptables-alerter-26xqh\" (UID: \"6a2e2e97-0696-4765-bf51-be31ec3c66ba\") " pod="openshift-network-operator/iptables-alerter-26xqh" Apr 21 03:57:19.139455 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137330 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-host-var-lib-cni-multus\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.139455 ip-10-0-131-93 kubenswrapper[2570]: 
I0421 03:57:19.137357 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-os-release\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.139455 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137357 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-host-slash\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.139455 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137378 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-node-log\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.139455 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137379 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-etc-sysctl-d\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.139455 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137422 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-host-cni-netd\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.139455 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137457 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-os-release\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.139455 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137481 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a284a31-b4ea-4280-a6d1-b84390d1488d-ovnkube-config\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.139455 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137513 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a284a31-b4ea-4280-a6d1-b84390d1488d-ovn-node-metrics-cert\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.139455 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137517 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8a8c6dad-8135-4d60-b437-56307544e064-tmp-dir\") pod \"node-resolver-26mzt\" (UID: \"8a8c6dad-8135-4d60-b437-56307544e064\") " pod="openshift-dns/node-resolver-26mzt" Apr 21 03:57:19.139455 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137539 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c98b9edd-432f-4ffa-a024-8b9f651147e0-konnectivity-ca\") pod \"konnectivity-agent-kmbjr\" (UID: \"c98b9edd-432f-4ffa-a024-8b9f651147e0\") " pod="kube-system/konnectivity-agent-kmbjr" Apr 21 03:57:19.139455 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137566 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6a2e2e97-0696-4765-bf51-be31ec3c66ba-iptables-alerter-script\") pod \"iptables-alerter-26xqh\" (UID: \"6a2e2e97-0696-4765-bf51-be31ec3c66ba\") " pod="openshift-network-operator/iptables-alerter-26xqh" Apr 21 03:57:19.139455 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137591 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/26270642-aaa5-4b43-804b-56317d766266-tmp\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.139455 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137617 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-multus-conf-dir\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.139455 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137643 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stvkw\" (UniqueName: \"kubernetes.io/projected/114ff0f2-95bc-49bb-be65-079af4d8294d-kube-api-access-stvkw\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.140331 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137669 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2b3ca347-33c6-4fbc-bc3d-a93f9f0c27fa-serviceca\") pod \"node-ca-jw8wp\" (UID: \"2b3ca347-33c6-4fbc-bc3d-a93f9f0c27fa\") " pod="openshift-image-registry/node-ca-jw8wp" Apr 21 03:57:19.140331 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137696 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-host-run-netns\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.140331 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137720 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-var-lib-kubelet\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.140331 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137753 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d9c4a2fd-4534-495a-8f40-6d8faf8f87e6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xddzt\" (UID: \"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6\") " pod="openshift-multus/multus-additional-cni-plugins-xddzt" Apr 21 03:57:19.140331 ip-10-0-131-93 
kubenswrapper[2570]: I0421 03:57:19.137780 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8znnq\" (UniqueName: \"kubernetes.io/projected/d9c4a2fd-4534-495a-8f40-6d8faf8f87e6-kube-api-access-8znnq\") pod \"multus-additional-cni-plugins-xddzt\" (UID: \"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6\") " pod="openshift-multus/multus-additional-cni-plugins-xddzt" Apr 21 03:57:19.140331 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137828 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-run\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.140331 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137853 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gk9v4\" (UniqueName: \"kubernetes.io/projected/26270642-aaa5-4b43-804b-56317d766266-kube-api-access-gk9v4\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.140331 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137878 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/114ff0f2-95bc-49bb-be65-079af4d8294d-cni-binary-copy\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.140331 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137901 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-etc-kubernetes\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.140331 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137878 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 03:57:19.140331 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137928 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4ec1512c-1a5f-42db-95f2-03f360fed0a5-device-dir\") pod \"aws-ebs-csi-driver-node-jrpm5\" (UID: \"4ec1512c-1a5f-42db-95f2-03f360fed0a5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5" Apr 21 03:57:19.140331 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.137981 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-var-lib-openvswitch\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.140331 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.138008 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p2vm4\" (UniqueName: \"kubernetes.io/projected/2b3ca347-33c6-4fbc-bc3d-a93f9f0c27fa-kube-api-access-p2vm4\") pod \"node-ca-jw8wp\" (UID: \"2b3ca347-33c6-4fbc-bc3d-a93f9f0c27fa\") " pod="openshift-image-registry/node-ca-jw8wp" Apr 21 03:57:19.140331 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.138050 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c98b9edd-432f-4ffa-a024-8b9f651147e0-konnectivity-ca\") pod \"konnectivity-agent-kmbjr\" (UID: \"c98b9edd-432f-4ffa-a024-8b9f651147e0\") " pod="kube-system/konnectivity-agent-kmbjr" Apr 21 03:57:19.140331 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.138123 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6a2e2e97-0696-4765-bf51-be31ec3c66ba-iptables-alerter-script\") pod \"iptables-alerter-26xqh\" (UID: \"6a2e2e97-0696-4765-bf51-be31ec3c66ba\") " pod="openshift-network-operator/iptables-alerter-26xqh" Apr 21 03:57:19.140331 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.138188 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-etc-kubernetes\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.140331 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.138240 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-run\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.140331 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.138347 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4ec1512c-1a5f-42db-95f2-03f360fed0a5-socket-dir\") pod \"aws-ebs-csi-driver-node-jrpm5\" (UID: \"4ec1512c-1a5f-42db-95f2-03f360fed0a5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5" Apr 21 03:57:19.141144 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.138385 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/3a284a31-b4ea-4280-a6d1-b84390d1488d-ovnkube-script-lib\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.141144 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.138436 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/26270642-aaa5-4b43-804b-56317d766266-etc-tuned\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.141144 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.138469 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4ec1512c-1a5f-42db-95f2-03f360fed0a5-etc-selinux\") pod \"aws-ebs-csi-driver-node-jrpm5\" (UID: \"4ec1512c-1a5f-42db-95f2-03f360fed0a5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5" Apr 21 03:57:19.141144 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.138465 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a284a31-b4ea-4280-a6d1-b84390d1488d-ovnkube-config\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.141144 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.138496 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4ec1512c-1a5f-42db-95f2-03f360fed0a5-sys-fs\") pod \"aws-ebs-csi-driver-node-jrpm5\" (UID: \"4ec1512c-1a5f-42db-95f2-03f360fed0a5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5" Apr 21 03:57:19.141144 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.138525 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c98b9edd-432f-4ffa-a024-8b9f651147e0-agent-certs\") pod \"konnectivity-agent-kmbjr\" (UID: \"c98b9edd-432f-4ffa-a024-8b9f651147e0\") " pod="kube-system/konnectivity-agent-kmbjr" Apr 21 03:57:19.141144 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.138607 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4ec1512c-1a5f-42db-95f2-03f360fed0a5-device-dir\") pod \"aws-ebs-csi-driver-node-jrpm5\" (UID: \"4ec1512c-1a5f-42db-95f2-03f360fed0a5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5" Apr 21 03:57:19.141144 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.138611 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-var-lib-openvswitch\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.141144 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.138642 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4ec1512c-1a5f-42db-95f2-03f360fed0a5-sys-fs\") pod \"aws-ebs-csi-driver-node-jrpm5\" (UID: \"4ec1512c-1a5f-42db-95f2-03f360fed0a5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5" Apr 21 03:57:19.141144 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.138649 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-var-lib-kubelet\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.141144 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.138696 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/114ff0f2-95bc-49bb-be65-079af4d8294d-cni-binary-copy\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.141144 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.138704 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-host-run-netns\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.141144 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.138746 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6a2e2e97-0696-4765-bf51-be31ec3c66ba-host-slash\") pod \"iptables-alerter-26xqh\" (UID: \"6a2e2e97-0696-4765-bf51-be31ec3c66ba\") " pod="openshift-network-operator/iptables-alerter-26xqh" Apr 21 03:57:19.141144 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.138774 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-etc-modprobe-d\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.141144 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.138793 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4ec1512c-1a5f-42db-95f2-03f360fed0a5-etc-selinux\") pod \"aws-ebs-csi-driver-node-jrpm5\" (UID: \"4ec1512c-1a5f-42db-95f2-03f360fed0a5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5" Apr 21 03:57:19.141144 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.138793 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4ec1512c-1a5f-42db-95f2-03f360fed0a5-socket-dir\") pod \"aws-ebs-csi-driver-node-jrpm5\" (UID: \"4ec1512c-1a5f-42db-95f2-03f360fed0a5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5" Apr 21 03:57:19.141144 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.138874 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-host-var-lib-cni-bin\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.141730 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.138888 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-etc-modprobe-d\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.141730 ip-10-0-131-93 kubenswrapper[2570]: 
I0421 03:57:19.138903 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d9c4a2fd-4534-495a-8f40-6d8faf8f87e6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xddzt\" (UID: \"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6\") " pod="openshift-multus/multus-additional-cni-plugins-xddzt" Apr 21 03:57:19.141730 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.138908 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6a2e2e97-0696-4765-bf51-be31ec3c66ba-host-slash\") pod \"iptables-alerter-26xqh\" (UID: \"6a2e2e97-0696-4765-bf51-be31ec3c66ba\") " pod="openshift-network-operator/iptables-alerter-26xqh" Apr 21 03:57:19.141730 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.138800 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-host-var-lib-cni-bin\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.141730 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.138932 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/114ff0f2-95bc-49bb-be65-079af4d8294d-multus-conf-dir\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.141730 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.138942 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b3ca347-33c6-4fbc-bc3d-a93f9f0c27fa-host\") pod \"node-ca-jw8wp\" (UID: \"2b3ca347-33c6-4fbc-bc3d-a93f9f0c27fa\") " pod="openshift-image-registry/node-ca-jw8wp" Apr 21 03:57:19.141730 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.139074 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d9c4a2fd-4534-495a-8f40-6d8faf8f87e6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xddzt\" (UID: \"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6\") " pod="openshift-multus/multus-additional-cni-plugins-xddzt" Apr 21 03:57:19.141730 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.139106 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-etc-openvswitch\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.141730 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.139147 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b3ca347-33c6-4fbc-bc3d-a93f9f0c27fa-host\") pod \"node-ca-jw8wp\" (UID: \"2b3ca347-33c6-4fbc-bc3d-a93f9f0c27fa\") " pod="openshift-image-registry/node-ca-jw8wp" Apr 21 03:57:19.141730 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.139425 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2b3ca347-33c6-4fbc-bc3d-a93f9f0c27fa-serviceca\") pod \"node-ca-jw8wp\" (UID: \"2b3ca347-33c6-4fbc-bc3d-a93f9f0c27fa\") " pod="openshift-image-registry/node-ca-jw8wp" Apr 21 03:57:19.141730 ip-10-0-131-93 
kubenswrapper[2570]: I0421 03:57:19.139433 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3a284a31-b4ea-4280-a6d1-b84390d1488d-ovnkube-script-lib\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.141730 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.139502 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a284a31-b4ea-4280-a6d1-b84390d1488d-etc-openvswitch\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.141730 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.139535 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-sys\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.141730 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.139570 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-host\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.141730 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.139652 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-sys\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.141730 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.139744 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26270642-aaa5-4b43-804b-56317d766266-host\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.141730 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.139752 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d9c4a2fd-4534-495a-8f40-6d8faf8f87e6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xddzt\" (UID: \"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6\") " pod="openshift-multus/multus-additional-cni-plugins-xddzt" Apr 21 03:57:19.141730 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.141704 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/26270642-aaa5-4b43-804b-56317d766266-tmp\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.142493 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.141723 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/26270642-aaa5-4b43-804b-56317d766266-etc-tuned\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.142493 ip-10-0-131-93 kubenswrapper[2570]: I0421 
03:57:19.142113 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a284a31-b4ea-4280-a6d1-b84390d1488d-ovn-node-metrics-cert\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.145217 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.145191 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c98b9edd-432f-4ffa-a024-8b9f651147e0-agent-certs\") pod \"konnectivity-agent-kmbjr\" (UID: \"c98b9edd-432f-4ffa-a024-8b9f651147e0\") " pod="kube-system/konnectivity-agent-kmbjr" Apr 21 03:57:19.146271 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:19.146247 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:57:19.146390 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:19.146276 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:57:19.146390 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:19.146292 2570 projected.go:194] Error preparing data for projected volume kube-api-access-c8tcf for pod openshift-network-diagnostics/network-check-target-ztjfh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:19.146390 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:19.146362 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/be04fd1e-83bf-49d7-8c60-4323b986ab81-kube-api-access-c8tcf podName:be04fd1e-83bf-49d7-8c60-4323b986ab81 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:19.646342795 +0000 UTC m=+3.074114664 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-c8tcf" (UniqueName: "kubernetes.io/projected/be04fd1e-83bf-49d7-8c60-4323b986ab81-kube-api-access-c8tcf") pod "network-check-target-ztjfh" (UID: "be04fd1e-83bf-49d7-8c60-4323b986ab81") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:19.147510 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.147483 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l27g\" (UniqueName: \"kubernetes.io/projected/4ec1512c-1a5f-42db-95f2-03f360fed0a5-kube-api-access-7l27g\") pod \"aws-ebs-csi-driver-node-jrpm5\" (UID: \"4ec1512c-1a5f-42db-95f2-03f360fed0a5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5" Apr 21 03:57:19.147596 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.147566 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b55qp\" (UniqueName: \"kubernetes.io/projected/8a8c6dad-8135-4d60-b437-56307544e064-kube-api-access-b55qp\") pod \"node-resolver-26mzt\" (UID: \"8a8c6dad-8135-4d60-b437-56307544e064\") " pod="openshift-dns/node-resolver-26mzt" Apr 21 03:57:19.149218 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.149005 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk9v4\" (UniqueName: \"kubernetes.io/projected/26270642-aaa5-4b43-804b-56317d766266-kube-api-access-gk9v4\") pod \"tuned-65jhd\" (UID: \"26270642-aaa5-4b43-804b-56317d766266\") " pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.149218 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.149175 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw5dk\" (UniqueName: \"kubernetes.io/projected/0642b1aa-ff76-4694-bad0-be2656b81005-kube-api-access-mw5dk\") pod \"network-metrics-daemon-lf2dl\" (UID: \"0642b1aa-ff76-4694-bad0-be2656b81005\") " pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:57:19.149367 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.149316 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-749sr\" (UniqueName: \"kubernetes.io/projected/3a284a31-b4ea-4280-a6d1-b84390d1488d-kube-api-access-749sr\") pod \"ovnkube-node-6p6fc\" (UID: \"3a284a31-b4ea-4280-a6d1-b84390d1488d\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.149856 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.149831 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8znnq\" (UniqueName: \"kubernetes.io/projected/d9c4a2fd-4534-495a-8f40-6d8faf8f87e6-kube-api-access-8znnq\") pod \"multus-additional-cni-plugins-xddzt\" (UID: \"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6\") " pod="openshift-multus/multus-additional-cni-plugins-xddzt" Apr 21 03:57:19.149984 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.149959 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2vm4\" (UniqueName: \"kubernetes.io/projected/2b3ca347-33c6-4fbc-bc3d-a93f9f0c27fa-kube-api-access-p2vm4\") pod \"node-ca-jw8wp\" (UID: \"2b3ca347-33c6-4fbc-bc3d-a93f9f0c27fa\") " pod="openshift-image-registry/node-ca-jw8wp" Apr 21 03:57:19.150641 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.150619 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxd8k\" (UniqueName: 
\"kubernetes.io/projected/6a2e2e97-0696-4765-bf51-be31ec3c66ba-kube-api-access-dxd8k\") pod \"iptables-alerter-26xqh\" (UID: \"6a2e2e97-0696-4765-bf51-be31ec3c66ba\") " pod="openshift-network-operator/iptables-alerter-26xqh" Apr 21 03:57:19.151056 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.151018 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stvkw\" (UniqueName: \"kubernetes.io/projected/114ff0f2-95bc-49bb-be65-079af4d8294d-kube-api-access-stvkw\") pod \"multus-zrwkk\" (UID: \"114ff0f2-95bc-49bb-be65-079af4d8294d\") " pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.169750 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.169726 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 03:57:19.320301 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.320151 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-26xqh" Apr 21 03:57:19.327164 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.327140 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-65jhd" Apr 21 03:57:19.336139 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.336107 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-26mzt" Apr 21 03:57:19.340823 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.340801 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zrwkk" Apr 21 03:57:19.347412 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.347391 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jw8wp" Apr 21 03:57:19.354067 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.354031 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xddzt" Apr 21 03:57:19.360681 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.360663 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5" Apr 21 03:57:19.369313 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.369294 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:19.373921 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.373902 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-kmbjr" Apr 21 03:57:19.643846 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.643783 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs\") pod \"network-metrics-daemon-lf2dl\" (UID: \"0642b1aa-ff76-4694-bad0-be2656b81005\") " pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:57:19.643979 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:19.643927 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:19.644021 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:19.643991 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs podName:0642b1aa-ff76-4694-bad0-be2656b81005 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:20.643970767 +0000 UTC m=+4.071742620 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs") pod "network-metrics-daemon-lf2dl" (UID: "0642b1aa-ff76-4694-bad0-be2656b81005") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:19.689439 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:19.689346 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b3ca347_33c6_4fbc_bc3d_a93f9f0c27fa.slice/crio-e7279937c0ea9f805c21577357b729e987241486a2e441ca650a4f865331cb1d WatchSource:0}: Error finding container e7279937c0ea9f805c21577357b729e987241486a2e441ca650a4f865331cb1d: Status 404 returned error can't find the container with id e7279937c0ea9f805c21577357b729e987241486a2e441ca650a4f865331cb1d Apr 21 03:57:19.690724 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:19.690701 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a2e2e97_0696_4765_bf51_be31ec3c66ba.slice/crio-91663c9c36b5c966639dc501f158febf1ec53aa6ab75d2c3dabe23b01db8dae7 WatchSource:0}: Error finding container 91663c9c36b5c966639dc501f158febf1ec53aa6ab75d2c3dabe23b01db8dae7: Status 404 returned error can't find the container with id 91663c9c36b5c966639dc501f158febf1ec53aa6ab75d2c3dabe23b01db8dae7 Apr 21 03:57:19.694356 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:19.694336 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod114ff0f2_95bc_49bb_be65_079af4d8294d.slice/crio-6cf5a0ff030bb2f5e1485781699dc8f73db49bab0afde589175471b0230f2bef WatchSource:0}: Error finding container 6cf5a0ff030bb2f5e1485781699dc8f73db49bab0afde589175471b0230f2bef: Status 404 returned error can't find the container with id 6cf5a0ff030bb2f5e1485781699dc8f73db49bab0afde589175471b0230f2bef Apr 21 03:57:19.695397 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:19.695364 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26270642_aaa5_4b43_804b_56317d766266.slice/crio-d0cdb5aff64c2cf1f6e4db100a9a801177778d3c4b6842f3e22bef62a08dbdb4 WatchSource:0}: Error finding container d0cdb5aff64c2cf1f6e4db100a9a801177778d3c4b6842f3e22bef62a08dbdb4: Status 404 returned error can't find the container with id 
d0cdb5aff64c2cf1f6e4db100a9a801177778d3c4b6842f3e22bef62a08dbdb4 Apr 21 03:57:19.696206 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:19.696184 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a8c6dad_8135_4d60_b437_56307544e064.slice/crio-7fa197250ba8fce907b81bf4d6b4be080faa8d4b1e165c2d277b182fae2f5a75 WatchSource:0}: Error finding container 7fa197250ba8fce907b81bf4d6b4be080faa8d4b1e165c2d277b182fae2f5a75: Status 404 returned error can't find the container with id 7fa197250ba8fce907b81bf4d6b4be080faa8d4b1e165c2d277b182fae2f5a75 Apr 21 03:57:19.698456 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:19.698346 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a284a31_b4ea_4280_a6d1_b84390d1488d.slice/crio-0133dd5dfc052b7a756176a7884ded5a60dba1ab4b8d073f27887be42b250c1f WatchSource:0}: Error finding container 0133dd5dfc052b7a756176a7884ded5a60dba1ab4b8d073f27887be42b250c1f: Status 404 returned error can't find the container with id 0133dd5dfc052b7a756176a7884ded5a60dba1ab4b8d073f27887be42b250c1f Apr 21 03:57:19.699545 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:19.699519 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ec1512c_1a5f_42db_95f2_03f360fed0a5.slice/crio-e2806ca9a36534f735899f4cad2e88af3452c7166339bb19f86477f73f710227 WatchSource:0}: Error finding container e2806ca9a36534f735899f4cad2e88af3452c7166339bb19f86477f73f710227: Status 404 returned error can't find the container with id e2806ca9a36534f735899f4cad2e88af3452c7166339bb19f86477f73f710227 Apr 21 03:57:19.700481 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:19.700460 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9c4a2fd_4534_495a_8f40_6d8faf8f87e6.slice/crio-8c3bf254f1487a68247a65552dfa727b15bcf0faf148b4b078014d520e7bc8cd WatchSource:0}: Error finding container 8c3bf254f1487a68247a65552dfa727b15bcf0faf148b4b078014d520e7bc8cd: Status 404 returned error can't find the container with id 8c3bf254f1487a68247a65552dfa727b15bcf0faf148b4b078014d520e7bc8cd Apr 21 03:57:19.701764 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:19.701667 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc98b9edd_432f_4ffa_a024_8b9f651147e0.slice/crio-efeb7aaf3761e56e5ed30fd10372b3d565b123c86c75a602d4603f5732037d55 WatchSource:0}: Error finding container efeb7aaf3761e56e5ed30fd10372b3d565b123c86c75a602d4603f5732037d55: Status 404 returned error can't find the container with id efeb7aaf3761e56e5ed30fd10372b3d565b123c86c75a602d4603f5732037d55 Apr 21 03:57:19.744614 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:19.744591 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8tcf\" (UniqueName: \"kubernetes.io/projected/be04fd1e-83bf-49d7-8c60-4323b986ab81-kube-api-access-c8tcf\") pod \"network-check-target-ztjfh\" (UID: \"be04fd1e-83bf-49d7-8c60-4323b986ab81\") " pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:57:19.744714 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:19.744702 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 
03:57:19.744779 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:19.744716 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:57:19.744779 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:19.744726 2570 projected.go:194] Error preparing data for projected volume kube-api-access-c8tcf for pod openshift-network-diagnostics/network-check-target-ztjfh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:19.744865 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:19.744782 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/be04fd1e-83bf-49d7-8c60-4323b986ab81-kube-api-access-c8tcf podName:be04fd1e-83bf-49d7-8c60-4323b986ab81 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:20.744762951 +0000 UTC m=+4.172534818 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-c8tcf" (UniqueName: "kubernetes.io/projected/be04fd1e-83bf-49d7-8c60-4323b986ab81-kube-api-access-c8tcf") pod "network-check-target-ztjfh" (UID: "be04fd1e-83bf-49d7-8c60-4323b986ab81") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:20.062000 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:20.061641 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 03:52:18 +0000 UTC" deadline="2027-10-30 00:32:20.755240733 +0000 UTC" Apr 21 03:57:20.062000 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:20.061910 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13364h35m0.693338495s" Apr 21 03:57:20.167551 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:20.166832 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-93.ec2.internal" event={"ID":"364f28bb57f65923dbeacedd6b253c36","Type":"ContainerStarted","Data":"f2b62852b2e65a764ac56146ed0fd8c00e84309318ad6d47cf1b9905b8fe05b0"} Apr 21 03:57:20.169482 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:20.169418 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-kmbjr" event={"ID":"c98b9edd-432f-4ffa-a024-8b9f651147e0","Type":"ContainerStarted","Data":"efeb7aaf3761e56e5ed30fd10372b3d565b123c86c75a602d4603f5732037d55"} Apr 21 03:57:20.174086 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:20.174060 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" event={"ID":"3a284a31-b4ea-4280-a6d1-b84390d1488d","Type":"ContainerStarted","Data":"0133dd5dfc052b7a756176a7884ded5a60dba1ab4b8d073f27887be42b250c1f"} Apr 21 03:57:20.177668 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:20.177023 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-26mzt" event={"ID":"8a8c6dad-8135-4d60-b437-56307544e064","Type":"ContainerStarted","Data":"7fa197250ba8fce907b81bf4d6b4be080faa8d4b1e165c2d277b182fae2f5a75"} Apr 21 03:57:20.182517 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:20.182492 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jw8wp" 
event={"ID":"2b3ca347-33c6-4fbc-bc3d-a93f9f0c27fa","Type":"ContainerStarted","Data":"e7279937c0ea9f805c21577357b729e987241486a2e441ca650a4f865331cb1d"} Apr 21 03:57:20.191865 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:20.191840 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xddzt" event={"ID":"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6","Type":"ContainerStarted","Data":"8c3bf254f1487a68247a65552dfa727b15bcf0faf148b4b078014d520e7bc8cd"} Apr 21 03:57:20.193856 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:20.193810 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5" event={"ID":"4ec1512c-1a5f-42db-95f2-03f360fed0a5","Type":"ContainerStarted","Data":"e2806ca9a36534f735899f4cad2e88af3452c7166339bb19f86477f73f710227"} Apr 21 03:57:20.195515 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:20.195466 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-65jhd" event={"ID":"26270642-aaa5-4b43-804b-56317d766266","Type":"ContainerStarted","Data":"d0cdb5aff64c2cf1f6e4db100a9a801177778d3c4b6842f3e22bef62a08dbdb4"} Apr 21 03:57:20.197195 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:20.197172 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zrwkk" event={"ID":"114ff0f2-95bc-49bb-be65-079af4d8294d","Type":"ContainerStarted","Data":"6cf5a0ff030bb2f5e1485781699dc8f73db49bab0afde589175471b0230f2bef"} Apr 21 03:57:20.198744 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:20.198722 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-26xqh" event={"ID":"6a2e2e97-0696-4765-bf51-be31ec3c66ba","Type":"ContainerStarted","Data":"91663c9c36b5c966639dc501f158febf1ec53aa6ab75d2c3dabe23b01db8dae7"} Apr 21 03:57:20.653344 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:20.653293 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs\") pod \"network-metrics-daemon-lf2dl\" (UID: \"0642b1aa-ff76-4694-bad0-be2656b81005\") " pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:57:20.653558 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:20.653538 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:20.653645 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:20.653631 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs podName:0642b1aa-ff76-4694-bad0-be2656b81005 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:22.653610256 +0000 UTC m=+6.081382123 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs") pod "network-metrics-daemon-lf2dl" (UID: "0642b1aa-ff76-4694-bad0-be2656b81005") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:20.754026 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:20.753986 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8tcf\" (UniqueName: \"kubernetes.io/projected/be04fd1e-83bf-49d7-8c60-4323b986ab81-kube-api-access-c8tcf\") pod \"network-check-target-ztjfh\" (UID: \"be04fd1e-83bf-49d7-8c60-4323b986ab81\") " pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:57:20.754233 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:20.754211 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:57:20.754315 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:20.754239 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:57:20.754315 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:20.754253 2570 projected.go:194] Error preparing data for projected volume kube-api-access-c8tcf for pod openshift-network-diagnostics/network-check-target-ztjfh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:20.754315 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:20.754312 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/be04fd1e-83bf-49d7-8c60-4323b986ab81-kube-api-access-c8tcf podName:be04fd1e-83bf-49d7-8c60-4323b986ab81 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:22.754294037 +0000 UTC m=+6.182065896 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-c8tcf" (UniqueName: "kubernetes.io/projected/be04fd1e-83bf-49d7-8c60-4323b986ab81-kube-api-access-c8tcf") pod "network-check-target-ztjfh" (UID: "be04fd1e-83bf-49d7-8c60-4323b986ab81") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:21.157083 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:21.156322 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:57:21.157083 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:21.156447 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ztjfh" podUID="be04fd1e-83bf-49d7-8c60-4323b986ab81" Apr 21 03:57:21.157083 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:21.156923 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:57:21.157083 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:21.157027 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lf2dl" podUID="0642b1aa-ff76-4694-bad0-be2656b81005" Apr 21 03:57:21.227739 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:21.227099 2570 generic.go:358] "Generic (PLEG): container finished" podID="5861d034824f0ed241e7223fb92ee95e" containerID="0e9ec05fda3298da4f0779a26c4ed2cd22f7a5552ec4a3b40c77eed8484f07cc" exitCode=0 Apr 21 03:57:21.227739 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:21.227670 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-93.ec2.internal" event={"ID":"5861d034824f0ed241e7223fb92ee95e","Type":"ContainerDied","Data":"0e9ec05fda3298da4f0779a26c4ed2cd22f7a5552ec4a3b40c77eed8484f07cc"} Apr 21 03:57:21.241213 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:21.239890 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-93.ec2.internal" podStartSLOduration=3.2398748250000002 podStartE2EDuration="3.239874825s" podCreationTimestamp="2026-04-21 03:57:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 03:57:20.180127932 +0000 UTC m=+3.607899808" watchObservedRunningTime="2026-04-21 03:57:21.239874825 +0000 UTC m=+4.667646702" Apr 21 03:57:22.235745 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:22.235024 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-93.ec2.internal" event={"ID":"5861d034824f0ed241e7223fb92ee95e","Type":"ContainerStarted","Data":"903292f3ad7c69282892f78f5f880150a175a093f2963c7f877b8972c3388374"} Apr 21 03:57:22.673245 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:22.673154 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs\") pod \"network-metrics-daemon-lf2dl\" (UID: \"0642b1aa-ff76-4694-bad0-be2656b81005\") " pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:57:22.673480 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:22.673344 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:22.673480 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:22.673411 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs podName:0642b1aa-ff76-4694-bad0-be2656b81005 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:26.673392313 +0000 UTC m=+10.101164191 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs") pod "network-metrics-daemon-lf2dl" (UID: "0642b1aa-ff76-4694-bad0-be2656b81005") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:22.773966 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:22.773904 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8tcf\" (UniqueName: \"kubernetes.io/projected/be04fd1e-83bf-49d7-8c60-4323b986ab81-kube-api-access-c8tcf\") pod \"network-check-target-ztjfh\" (UID: \"be04fd1e-83bf-49d7-8c60-4323b986ab81\") " pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:57:22.774217 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:22.774083 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:57:22.774217 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:22.774103 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:57:22.774217 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:22.774116 2570 projected.go:194] Error preparing data for projected volume kube-api-access-c8tcf for pod openshift-network-diagnostics/network-check-target-ztjfh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:22.774217 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:22.774179 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/be04fd1e-83bf-49d7-8c60-4323b986ab81-kube-api-access-c8tcf podName:be04fd1e-83bf-49d7-8c60-4323b986ab81 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:26.774161523 +0000 UTC m=+10.201933381 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-c8tcf" (UniqueName: "kubernetes.io/projected/be04fd1e-83bf-49d7-8c60-4323b986ab81-kube-api-access-c8tcf") pod "network-check-target-ztjfh" (UID: "be04fd1e-83bf-49d7-8c60-4323b986ab81") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:23.157485 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:23.156911 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:57:23.157485 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:23.157056 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ztjfh" podUID="be04fd1e-83bf-49d7-8c60-4323b986ab81" Apr 21 03:57:23.157898 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:23.157756 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:57:23.157898 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:23.157863 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lf2dl" podUID="0642b1aa-ff76-4694-bad0-be2656b81005" Apr 21 03:57:25.156252 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:25.156210 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:57:25.156706 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:25.156347 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ztjfh" podUID="be04fd1e-83bf-49d7-8c60-4323b986ab81" Apr 21 03:57:25.156706 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:25.156428 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:57:25.156706 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:25.156668 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lf2dl" podUID="0642b1aa-ff76-4694-bad0-be2656b81005" Apr 21 03:57:26.709172 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:26.709133 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs\") pod \"network-metrics-daemon-lf2dl\" (UID: \"0642b1aa-ff76-4694-bad0-be2656b81005\") " pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:57:26.709636 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:26.709309 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:26.709636 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:26.709376 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs podName:0642b1aa-ff76-4694-bad0-be2656b81005 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:34.709357969 +0000 UTC m=+18.137129824 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs") pod "network-metrics-daemon-lf2dl" (UID: "0642b1aa-ff76-4694-bad0-be2656b81005") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:26.809892 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:26.809846 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8tcf\" (UniqueName: \"kubernetes.io/projected/be04fd1e-83bf-49d7-8c60-4323b986ab81-kube-api-access-c8tcf\") pod \"network-check-target-ztjfh\" (UID: \"be04fd1e-83bf-49d7-8c60-4323b986ab81\") " pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:57:26.810093 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:26.810071 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:57:26.810093 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:26.810092 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:57:26.810215 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:26.810105 2570 projected.go:194] Error preparing data for projected volume kube-api-access-c8tcf for pod openshift-network-diagnostics/network-check-target-ztjfh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:26.810215 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:26.810167 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/be04fd1e-83bf-49d7-8c60-4323b986ab81-kube-api-access-c8tcf podName:be04fd1e-83bf-49d7-8c60-4323b986ab81 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:34.810150028 +0000 UTC m=+18.237921887 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-c8tcf" (UniqueName: "kubernetes.io/projected/be04fd1e-83bf-49d7-8c60-4323b986ab81-kube-api-access-c8tcf") pod "network-check-target-ztjfh" (UID: "be04fd1e-83bf-49d7-8c60-4323b986ab81") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:27.157535 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:27.157501 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:57:27.157712 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:27.157618 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ztjfh" podUID="be04fd1e-83bf-49d7-8c60-4323b986ab81" Apr 21 03:57:27.158159 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:27.157993 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:57:27.158159 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:27.158115 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lf2dl" podUID="0642b1aa-ff76-4694-bad0-be2656b81005" Apr 21 03:57:29.156531 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:29.156494 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:57:29.156975 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:29.156494 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:57:29.156975 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:29.156619 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ztjfh" podUID="be04fd1e-83bf-49d7-8c60-4323b986ab81" Apr 21 03:57:29.156975 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:29.156748 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lf2dl" podUID="0642b1aa-ff76-4694-bad0-be2656b81005" Apr 21 03:57:31.156204 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:31.156165 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:57:31.156611 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:31.156309 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lf2dl" podUID="0642b1aa-ff76-4694-bad0-be2656b81005" Apr 21 03:57:31.156611 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:31.156360 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:57:31.156611 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:31.156456 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ztjfh" podUID="be04fd1e-83bf-49d7-8c60-4323b986ab81" Apr 21 03:57:33.156247 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:33.156203 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:57:33.156698 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:33.156225 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:57:33.156698 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:33.156351 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ztjfh" podUID="be04fd1e-83bf-49d7-8c60-4323b986ab81" Apr 21 03:57:33.156698 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:33.156410 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lf2dl" podUID="0642b1aa-ff76-4694-bad0-be2656b81005" Apr 21 03:57:34.762232 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:34.762196 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs\") pod \"network-metrics-daemon-lf2dl\" (UID: \"0642b1aa-ff76-4694-bad0-be2656b81005\") " pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:57:34.762627 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:34.762348 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:34.762627 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:34.762412 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs podName:0642b1aa-ff76-4694-bad0-be2656b81005 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:50.762396578 +0000 UTC m=+34.190168432 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs") pod "network-metrics-daemon-lf2dl" (UID: "0642b1aa-ff76-4694-bad0-be2656b81005") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:34.862732 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:34.862690 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8tcf\" (UniqueName: \"kubernetes.io/projected/be04fd1e-83bf-49d7-8c60-4323b986ab81-kube-api-access-c8tcf\") pod \"network-check-target-ztjfh\" (UID: \"be04fd1e-83bf-49d7-8c60-4323b986ab81\") " pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:57:34.862900 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:34.862867 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:57:34.862900 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:34.862887 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:57:34.862900 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:34.862899 2570 projected.go:194] Error preparing data for projected volume kube-api-access-c8tcf for pod openshift-network-diagnostics/network-check-target-ztjfh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:34.863017 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:34.862957 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/be04fd1e-83bf-49d7-8c60-4323b986ab81-kube-api-access-c8tcf podName:be04fd1e-83bf-49d7-8c60-4323b986ab81 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:50.86294341 +0000 UTC m=+34.290715264 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-c8tcf" (UniqueName: "kubernetes.io/projected/be04fd1e-83bf-49d7-8c60-4323b986ab81-kube-api-access-c8tcf") pod "network-check-target-ztjfh" (UID: "be04fd1e-83bf-49d7-8c60-4323b986ab81") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:35.156131 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:35.156030 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:57:35.156292 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:35.156173 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lf2dl" podUID="0642b1aa-ff76-4694-bad0-be2656b81005" Apr 21 03:57:35.156292 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:35.156234 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:57:35.156406 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:35.156333 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ztjfh" podUID="be04fd1e-83bf-49d7-8c60-4323b986ab81" Apr 21 03:57:37.157338 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.157168 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:57:37.157799 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.157236 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:57:37.157799 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:37.157425 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ztjfh" podUID="be04fd1e-83bf-49d7-8c60-4323b986ab81" Apr 21 03:57:37.157799 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:37.157524 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lf2dl" podUID="0642b1aa-ff76-4694-bad0-be2656b81005" Apr 21 03:57:37.180594 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.180547 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-93.ec2.internal" podStartSLOduration=19.180529848 podStartE2EDuration="19.180529848s" podCreationTimestamp="2026-04-21 03:57:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 03:57:22.248781066 +0000 UTC m=+5.676552943" watchObservedRunningTime="2026-04-21 03:57:37.180529848 +0000 UTC m=+20.608301724" Apr 21 03:57:37.181323 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.181305 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-m69zf"] Apr 21 03:57:37.192014 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.191998 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m69zf" Apr 21 03:57:37.192116 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:37.192077 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-m69zf" podUID="76716b30-cec0-4d8e-8c08-452dfeb18893" Apr 21 03:57:37.260764 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.260658 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6p6fc_3a284a31-b4ea-4280-a6d1-b84390d1488d/ovn-acl-logging/0.log" Apr 21 03:57:37.261094 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.261074 2570 generic.go:358] "Generic (PLEG): container finished" podID="3a284a31-b4ea-4280-a6d1-b84390d1488d" containerID="2623eebdbd678664442c0a9d34c84705f74f44564cea6b4466b9df770d262414" exitCode=1 Apr 21 03:57:37.261160 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.261134 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" event={"ID":"3a284a31-b4ea-4280-a6d1-b84390d1488d","Type":"ContainerStarted","Data":"e3e3a2e50c6f2643826d9a47e42009a7ea48d924ef963e9d0c47b456b0551bfe"} Apr 21 03:57:37.261213 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.261157 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" event={"ID":"3a284a31-b4ea-4280-a6d1-b84390d1488d","Type":"ContainerStarted","Data":"980ca9c0dd5a5609cfa2ed8078edc6356ec7e556adce14f927d3b294d59f0b1a"} Apr 21 03:57:37.261213 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.261172 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" event={"ID":"3a284a31-b4ea-4280-a6d1-b84390d1488d","Type":"ContainerDied","Data":"2623eebdbd678664442c0a9d34c84705f74f44564cea6b4466b9df770d262414"} Apr 21 03:57:37.261213 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.261184 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" event={"ID":"3a284a31-b4ea-4280-a6d1-b84390d1488d","Type":"ContainerStarted","Data":"8724b2cf2fc373dd25e70f9c2a6ef6a0192e69d28099cffe933679b2c58c9102"} Apr 21 03:57:37.262493 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.262448 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-26mzt" event={"ID":"8a8c6dad-8135-4d60-b437-56307544e064","Type":"ContainerStarted","Data":"eb95d35be118935096b9d4d92a9def4fc1b1ef422000cf38b0712215d8336f40"} Apr 21 03:57:37.263778 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.263751 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jw8wp" event={"ID":"2b3ca347-33c6-4fbc-bc3d-a93f9f0c27fa","Type":"ContainerStarted","Data":"dd9b8310b7f5f4897f361613135eef48620f3c44f377077372de27276a6ad88a"} Apr 21 03:57:37.265204 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.265179 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xddzt" event={"ID":"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6","Type":"ContainerStarted","Data":"39b63e6bf9011dd1c93bfb93232a98fa6e74dcb8282cacc6873e9749e11c102c"} Apr 21 03:57:37.266527 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.266502 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5" event={"ID":"4ec1512c-1a5f-42db-95f2-03f360fed0a5","Type":"ContainerStarted","Data":"e7aed27ff6daa5489a04e63be24a869049b6e73dfecf70943882d1e6a39c58df"} Apr 21 03:57:37.267824 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.267801 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-65jhd" 
event={"ID":"26270642-aaa5-4b43-804b-56317d766266","Type":"ContainerStarted","Data":"19e17bf9c6b7e670f938e2101bb7ce30c2a1a2f2e6748fb41e9bd8c29be44faf"} Apr 21 03:57:37.269582 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.269558 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zrwkk" event={"ID":"114ff0f2-95bc-49bb-be65-079af4d8294d","Type":"ContainerStarted","Data":"84d9a8a9a5f7873246b69a45bdabeb0d5973f73f7d76724a399bfbdad082c7ee"} Apr 21 03:57:37.271123 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.271098 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-kmbjr" event={"ID":"c98b9edd-432f-4ffa-a024-8b9f651147e0","Type":"ContainerStarted","Data":"8062a27dfe26800d44d0edd6aba1cc01572fa0bd4de801fb5f26efa7926fe218"} Apr 21 03:57:37.277479 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.277443 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-26mzt" podStartSLOduration=3.350666747 podStartE2EDuration="20.277432004s" podCreationTimestamp="2026-04-21 03:57:17 +0000 UTC" firstStartedPulling="2026-04-21 03:57:19.698179326 +0000 UTC m=+3.125951180" lastFinishedPulling="2026-04-21 03:57:36.62494457 +0000 UTC m=+20.052716437" observedRunningTime="2026-04-21 03:57:37.277092888 +0000 UTC m=+20.704864763" watchObservedRunningTime="2026-04-21 03:57:37.277432004 +0000 UTC m=+20.705203878" Apr 21 03:57:37.279118 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.279097 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/76716b30-cec0-4d8e-8c08-452dfeb18893-original-pull-secret\") pod \"global-pull-secret-syncer-m69zf\" (UID: \"76716b30-cec0-4d8e-8c08-452dfeb18893\") " pod="kube-system/global-pull-secret-syncer-m69zf" Apr 21 03:57:37.279190 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.279131 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/76716b30-cec0-4d8e-8c08-452dfeb18893-kubelet-config\") pod \"global-pull-secret-syncer-m69zf\" (UID: \"76716b30-cec0-4d8e-8c08-452dfeb18893\") " pod="kube-system/global-pull-secret-syncer-m69zf" Apr 21 03:57:37.279273 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.279250 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/76716b30-cec0-4d8e-8c08-452dfeb18893-dbus\") pod \"global-pull-secret-syncer-m69zf\" (UID: \"76716b30-cec0-4d8e-8c08-452dfeb18893\") " pod="kube-system/global-pull-secret-syncer-m69zf" Apr 21 03:57:37.290889 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.290833 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zrwkk" podStartSLOduration=3.325791768 podStartE2EDuration="20.29081512s" podCreationTimestamp="2026-04-21 03:57:17 +0000 UTC" firstStartedPulling="2026-04-21 03:57:19.696298546 +0000 UTC m=+3.124070417" lastFinishedPulling="2026-04-21 03:57:36.661321913 +0000 UTC m=+20.089093769" observedRunningTime="2026-04-21 03:57:37.290756555 +0000 UTC m=+20.718528429" watchObservedRunningTime="2026-04-21 03:57:37.29081512 +0000 UTC m=+20.718586995" Apr 21 03:57:37.309586 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.309534 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jw8wp" 
podStartSLOduration=3.351024732 podStartE2EDuration="20.309515601s" podCreationTimestamp="2026-04-21 03:57:17 +0000 UTC" firstStartedPulling="2026-04-21 03:57:19.691229566 +0000 UTC m=+3.119001433" lastFinishedPulling="2026-04-21 03:57:36.649720443 +0000 UTC m=+20.077492302" observedRunningTime="2026-04-21 03:57:37.309007793 +0000 UTC m=+20.736779668" watchObservedRunningTime="2026-04-21 03:57:37.309515601 +0000 UTC m=+20.737287477" Apr 21 03:57:37.354113 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.354067 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-65jhd" podStartSLOduration=3.399772712 podStartE2EDuration="20.354053215s" podCreationTimestamp="2026-04-21 03:57:17 +0000 UTC" firstStartedPulling="2026-04-21 03:57:19.697029409 +0000 UTC m=+3.124801264" lastFinishedPulling="2026-04-21 03:57:36.651309901 +0000 UTC m=+20.079081767" observedRunningTime="2026-04-21 03:57:37.342193016 +0000 UTC m=+20.769964891" watchObservedRunningTime="2026-04-21 03:57:37.354053215 +0000 UTC m=+20.781825080" Apr 21 03:57:37.354653 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.354626 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-kmbjr" podStartSLOduration=11.432046615 podStartE2EDuration="20.354617276s" podCreationTimestamp="2026-04-21 03:57:17 +0000 UTC" firstStartedPulling="2026-04-21 03:57:19.703634467 +0000 UTC m=+3.131406321" lastFinishedPulling="2026-04-21 03:57:28.626205112 +0000 UTC m=+12.053976982" observedRunningTime="2026-04-21 03:57:37.354139832 +0000 UTC m=+20.781911709" watchObservedRunningTime="2026-04-21 03:57:37.354617276 +0000 UTC m=+20.782389150" Apr 21 03:57:37.380250 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.380225 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/76716b30-cec0-4d8e-8c08-452dfeb18893-original-pull-secret\") pod \"global-pull-secret-syncer-m69zf\" (UID: \"76716b30-cec0-4d8e-8c08-452dfeb18893\") " pod="kube-system/global-pull-secret-syncer-m69zf" Apr 21 03:57:37.380531 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.380359 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/76716b30-cec0-4d8e-8c08-452dfeb18893-kubelet-config\") pod \"global-pull-secret-syncer-m69zf\" (UID: \"76716b30-cec0-4d8e-8c08-452dfeb18893\") " pod="kube-system/global-pull-secret-syncer-m69zf" Apr 21 03:57:37.381419 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.381393 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/76716b30-cec0-4d8e-8c08-452dfeb18893-dbus\") pod \"global-pull-secret-syncer-m69zf\" (UID: \"76716b30-cec0-4d8e-8c08-452dfeb18893\") " pod="kube-system/global-pull-secret-syncer-m69zf" Apr 21 03:57:37.381555 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.381539 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/76716b30-cec0-4d8e-8c08-452dfeb18893-dbus\") pod \"global-pull-secret-syncer-m69zf\" (UID: \"76716b30-cec0-4d8e-8c08-452dfeb18893\") " pod="kube-system/global-pull-secret-syncer-m69zf" Apr 21 03:57:37.381640 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.381595 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/76716b30-cec0-4d8e-8c08-452dfeb18893-kubelet-config\") pod \"global-pull-secret-syncer-m69zf\" (UID: \"76716b30-cec0-4d8e-8c08-452dfeb18893\") " pod="kube-system/global-pull-secret-syncer-m69zf" Apr 21 03:57:37.382441 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:37.382418 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 03:57:37.382521 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:37.382478 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76716b30-cec0-4d8e-8c08-452dfeb18893-original-pull-secret podName:76716b30-cec0-4d8e-8c08-452dfeb18893 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:37.882459933 +0000 UTC m=+21.310231789 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/76716b30-cec0-4d8e-8c08-452dfeb18893-original-pull-secret") pod "global-pull-secret-syncer-m69zf" (UID: "76716b30-cec0-4d8e-8c08-452dfeb18893") : object "kube-system"/"original-pull-secret" not registered Apr 21 03:57:37.885389 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:37.885357 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/76716b30-cec0-4d8e-8c08-452dfeb18893-original-pull-secret\") pod \"global-pull-secret-syncer-m69zf\" (UID: \"76716b30-cec0-4d8e-8c08-452dfeb18893\") " pod="kube-system/global-pull-secret-syncer-m69zf" Apr 21 03:57:37.885531 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:37.885507 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 03:57:37.885584 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:37.885578 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76716b30-cec0-4d8e-8c08-452dfeb18893-original-pull-secret podName:76716b30-cec0-4d8e-8c08-452dfeb18893 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:38.885561865 +0000 UTC m=+22.313333722 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/76716b30-cec0-4d8e-8c08-452dfeb18893-original-pull-secret") pod "global-pull-secret-syncer-m69zf" (UID: "76716b30-cec0-4d8e-8c08-452dfeb18893") : object "kube-system"/"original-pull-secret" not registered Apr 21 03:57:38.192680 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:38.192658 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 03:57:38.274647 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:38.274555 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-26xqh" event={"ID":"6a2e2e97-0696-4765-bf51-be31ec3c66ba","Type":"ContainerStarted","Data":"81dcd5860820c65ceacd36f318d10252e1d2e8a85759247cb25e67d8917c1daf"} Apr 21 03:57:38.276926 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:38.276905 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6p6fc_3a284a31-b4ea-4280-a6d1-b84390d1488d/ovn-acl-logging/0.log" Apr 21 03:57:38.277281 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:38.277263 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" event={"ID":"3a284a31-b4ea-4280-a6d1-b84390d1488d","Type":"ContainerStarted","Data":"f5713c1a4e635dc119483387544d0a6b9dc08d14563444b9e8ceb9e759f62443"} Apr 21 03:57:38.277339 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:38.277289 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" event={"ID":"3a284a31-b4ea-4280-a6d1-b84390d1488d","Type":"ContainerStarted","Data":"30048dcf3b06aac3a19e957801543777d3d0333bc335e4c693c49f112943c4ab"} Apr 21 03:57:38.278550 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:38.278528 2570 generic.go:358] "Generic (PLEG): container finished" podID="d9c4a2fd-4534-495a-8f40-6d8faf8f87e6" containerID="39b63e6bf9011dd1c93bfb93232a98fa6e74dcb8282cacc6873e9749e11c102c" exitCode=0 Apr 21 03:57:38.278611 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:38.278590 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xddzt" event={"ID":"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6","Type":"ContainerDied","Data":"39b63e6bf9011dd1c93bfb93232a98fa6e74dcb8282cacc6873e9749e11c102c"} Apr 21 03:57:38.280232 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:38.280209 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5" event={"ID":"4ec1512c-1a5f-42db-95f2-03f360fed0a5","Type":"ContainerStarted","Data":"c63b16cc9c88bbf2d6a00ee1026b30b802346d7fd881aa273ae48a90701d6c98"} Apr 21 03:57:38.300213 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:38.300168 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-26xqh" podStartSLOduration=4.344343788 podStartE2EDuration="21.300155388s" podCreationTimestamp="2026-04-21 03:57:17 +0000 UTC" firstStartedPulling="2026-04-21 03:57:19.69347993 +0000 UTC m=+3.121251783" lastFinishedPulling="2026-04-21 03:57:36.649291513 +0000 UTC m=+20.077063383" observedRunningTime="2026-04-21 03:57:38.286151568 +0000 UTC m=+21.713923443" watchObservedRunningTime="2026-04-21 03:57:38.300155388 +0000 UTC m=+21.727927263" Apr 21 03:57:38.893209 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:38.893175 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/76716b30-cec0-4d8e-8c08-452dfeb18893-original-pull-secret\") pod \"global-pull-secret-syncer-m69zf\" (UID: \"76716b30-cec0-4d8e-8c08-452dfeb18893\") " pod="kube-system/global-pull-secret-syncer-m69zf" Apr 21 03:57:38.893373 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:38.893318 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 03:57:38.893424 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:38.893394 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76716b30-cec0-4d8e-8c08-452dfeb18893-original-pull-secret podName:76716b30-cec0-4d8e-8c08-452dfeb18893 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:40.893374487 +0000 UTC m=+24.321146342 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/76716b30-cec0-4d8e-8c08-452dfeb18893-original-pull-secret") pod "global-pull-secret-syncer-m69zf" (UID: "76716b30-cec0-4d8e-8c08-452dfeb18893") : object "kube-system"/"original-pull-secret" not registered Apr 21 03:57:39.098196 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:39.098082 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T03:57:38.192675124Z","UUID":"1abafb55-3bd6-4454-a869-56289c5e34bf","Handler":null,"Name":"","Endpoint":""} Apr 21 03:57:39.101439 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:39.101415 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 03:57:39.101439 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:39.101445 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 03:57:39.156261 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:39.156182 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m69zf" Apr 21 03:57:39.156261 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:39.156214 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:57:39.156467 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:39.156182 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:57:39.156467 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:39.156287 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m69zf" podUID="76716b30-cec0-4d8e-8c08-452dfeb18893" Apr 21 03:57:39.156467 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:39.156410 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ztjfh" podUID="be04fd1e-83bf-49d7-8c60-4323b986ab81" Apr 21 03:57:39.156607 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:39.156482 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lf2dl" podUID="0642b1aa-ff76-4694-bad0-be2656b81005" Apr 21 03:57:40.285870 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:40.285611 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5" event={"ID":"4ec1512c-1a5f-42db-95f2-03f360fed0a5","Type":"ContainerStarted","Data":"e79e0cb3ddcf51b7f5d2fafee4ae58c2c1a17febc3ee7eb5c3ebdcb519474e85"} Apr 21 03:57:40.288384 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:40.288356 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6p6fc_3a284a31-b4ea-4280-a6d1-b84390d1488d/ovn-acl-logging/0.log" Apr 21 03:57:40.288724 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:40.288690 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" event={"ID":"3a284a31-b4ea-4280-a6d1-b84390d1488d","Type":"ContainerStarted","Data":"e607c00702fc9c2824f73b3cfe2a526d8cd9585b3673e084ecd4a4ea369d0603"} Apr 21 03:57:40.298466 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:40.298416 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jrpm5" podStartSLOduration=3.820778195 podStartE2EDuration="23.298404506s" podCreationTimestamp="2026-04-21 03:57:17 +0000 UTC" firstStartedPulling="2026-04-21 03:57:19.702206416 +0000 UTC m=+3.129978272" lastFinishedPulling="2026-04-21 03:57:39.179832727 +0000 UTC m=+22.607604583" observedRunningTime="2026-04-21 03:57:40.298272567 +0000 UTC m=+23.726044443" watchObservedRunningTime="2026-04-21 03:57:40.298404506 +0000 UTC m=+23.726176384" Apr 21 03:57:40.907083 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:40.907033 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/76716b30-cec0-4d8e-8c08-452dfeb18893-original-pull-secret\") pod \"global-pull-secret-syncer-m69zf\" (UID: \"76716b30-cec0-4d8e-8c08-452dfeb18893\") " pod="kube-system/global-pull-secret-syncer-m69zf" Apr 21 03:57:40.907281 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:40.907203 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 03:57:40.907342 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:40.907282 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76716b30-cec0-4d8e-8c08-452dfeb18893-original-pull-secret podName:76716b30-cec0-4d8e-8c08-452dfeb18893 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:44.907262097 +0000 UTC m=+28.335033968 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/76716b30-cec0-4d8e-8c08-452dfeb18893-original-pull-secret") pod "global-pull-secret-syncer-m69zf" (UID: "76716b30-cec0-4d8e-8c08-452dfeb18893") : object "kube-system"/"original-pull-secret" not registered Apr 21 03:57:41.156537 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:41.156499 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:57:41.156710 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:41.156499 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:57:41.156710 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:41.156636 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lf2dl" podUID="0642b1aa-ff76-4694-bad0-be2656b81005" Apr 21 03:57:41.156710 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:41.156499 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m69zf" Apr 21 03:57:41.156845 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:41.156728 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ztjfh" podUID="be04fd1e-83bf-49d7-8c60-4323b986ab81" Apr 21 03:57:41.156845 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:41.156796 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m69zf" podUID="76716b30-cec0-4d8e-8c08-452dfeb18893" Apr 21 03:57:42.035260 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:42.035188 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-kmbjr" Apr 21 03:57:42.035913 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:42.035893 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-kmbjr" Apr 21 03:57:42.292697 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:42.292643 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-kmbjr" Apr 21 03:57:42.293256 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:42.293241 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-kmbjr" Apr 21 03:57:43.156891 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:43.156711 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:57:43.156891 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:43.156731 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-m69zf" Apr 21 03:57:43.157715 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:43.156731 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:57:43.157715 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:43.157435 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m69zf" podUID="76716b30-cec0-4d8e-8c08-452dfeb18893" Apr 21 03:57:43.157715 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:43.157675 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ztjfh" podUID="be04fd1e-83bf-49d7-8c60-4323b986ab81" Apr 21 03:57:43.160120 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:43.158762 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lf2dl" podUID="0642b1aa-ff76-4694-bad0-be2656b81005" Apr 21 03:57:43.296588 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:43.296561 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6p6fc_3a284a31-b4ea-4280-a6d1-b84390d1488d/ovn-acl-logging/0.log" Apr 21 03:57:43.296855 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:43.296830 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" event={"ID":"3a284a31-b4ea-4280-a6d1-b84390d1488d","Type":"ContainerStarted","Data":"9688e3590a1d8ebe7e85d7d4381066bbb54f2035788bc9bdb70a0cbf176d6e8b"} Apr 21 03:57:43.297141 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:43.297122 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:43.297350 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:43.297332 2570 scope.go:117] "RemoveContainer" containerID="2623eebdbd678664442c0a9d34c84705f74f44564cea6b4466b9df770d262414" Apr 21 03:57:43.298594 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:43.298562 2570 generic.go:358] "Generic (PLEG): container finished" podID="d9c4a2fd-4534-495a-8f40-6d8faf8f87e6" containerID="9474ec66466604de1769a253b58a0d7327f121bff2bb8dff23ac3dff42be40b7" exitCode=0 Apr 21 03:57:43.298696 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:43.298596 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xddzt" event={"ID":"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6","Type":"ContainerDied","Data":"9474ec66466604de1769a253b58a0d7327f121bff2bb8dff23ac3dff42be40b7"} Apr 21 03:57:43.312493 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:43.312473 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" 
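Two patterns are worth pulling out of the entries above before the stream continues. First, for a volume that repeatedly fails to mount, the kubelet doubles durationBeforeRetry on each attempt: original-pull-secret goes 500ms, 1s, 2s, 4s (and later 8s), while metrics-certs and kube-api-access-c8tcf are already at 16s. Second, the "Observed pod startup duration" entries are consistent with podStartSLOduration being podStartE2EDuration minus the image-pull window (firstStartedPulling to lastFinishedPulling); for node-resolver-26mzt that is roughly 20.28s minus 16.93s, matching the logged 3.35s. Below is a minimal sketch for extracting both from a saved copy of this journal. It is not part of the log: the file name, regexes, and helper names are illustrative assumptions, and it assumes one journal entry per line as journalctl normally emits (this excerpt wraps entries across lines).

import re
from collections import defaultdict

# Matches the nestedpendingoperations.go retry entries, e.g.
#   Operation for "{volumeName:kubernetes.io/secret/...-original-pull-secret ...}" failed.
#   No retries permitted until ... (durationBeforeRetry 2s).
RETRY_RE = re.compile(
    r'Operation for "\{volumeName:(?P<volume>\S+)'
    r'.*?durationBeforeRetry (?P<delay>[^)]+)\)'
)

# Matches the pod_startup_latency_tracker.go entries, e.g.
#   "Observed pod startup duration" pod="openshift-dns/node-resolver-26mzt"
#   podStartSLOduration=3.350666747 podStartE2EDuration="20.277432004s" ...
SLO_RE = re.compile(
    r'pod="(?P<pod>[^"]+)" podStartSLOduration=(?P<slo>[\d.]+) '
    r'podStartE2EDuration="(?P<e2e>[\d.]+)s"'
)

def summarise(path="kubelet-journal.txt"):
    """Collect per-volume mount retry delays and per-pod startup durations."""
    retries = defaultdict(list)   # volume name -> delays in the order logged
    startups = {}                 # pod -> (SLO seconds, end-to-end seconds)
    with open(path) as fh:
        for line in fh:
            if (m := RETRY_RE.search(line)):
                retries[m["volume"]].append(m["delay"])
            if (m := SLO_RE.search(line)):
                startups[m["pod"]] = (float(m["slo"]), float(m["e2e"]))
    return retries, startups

if __name__ == "__main__":
    retries, startups = summarise()
    for volume, delays in retries.items():
        print(volume, "->", ", ".join(delays))       # shows the doubling backoff
    for pod, (slo, e2e) in startups.items():
        print(f"{pod}: SLO {slo:.2f}s of {e2e:.2f}s end-to-end")

Run against this node's journal, the retry summary makes it easy to see which "not registered" secrets/configmaps are still failing (and how long until the next attempt), and the startup summary shows how much of each pod's end-to-end time was spent pulling images versus actually starting.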
Apr 21 03:57:44.239615 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:44.239543 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-m69zf"] Apr 21 03:57:44.240065 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:44.239646 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m69zf" Apr 21 03:57:44.240065 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:44.239721 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m69zf" podUID="76716b30-cec0-4d8e-8c08-452dfeb18893" Apr 21 03:57:44.242496 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:44.242471 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ztjfh"] Apr 21 03:57:44.242605 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:44.242566 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:57:44.242650 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:44.242639 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ztjfh" podUID="be04fd1e-83bf-49d7-8c60-4323b986ab81" Apr 21 03:57:44.242972 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:44.242950 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lf2dl"] Apr 21 03:57:44.243057 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:44.243032 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:57:44.243138 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:44.243122 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lf2dl" podUID="0642b1aa-ff76-4694-bad0-be2656b81005" Apr 21 03:57:44.302070 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:44.302024 2570 generic.go:358] "Generic (PLEG): container finished" podID="d9c4a2fd-4534-495a-8f40-6d8faf8f87e6" containerID="9d5e7959b9e01fc522ed40f824622b60c47359c81c2f692a2d7c06edeaaa3d9c" exitCode=0 Apr 21 03:57:44.302227 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:44.302107 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xddzt" event={"ID":"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6","Type":"ContainerDied","Data":"9d5e7959b9e01fc522ed40f824622b60c47359c81c2f692a2d7c06edeaaa3d9c"} Apr 21 03:57:44.308225 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:44.308207 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6p6fc_3a284a31-b4ea-4280-a6d1-b84390d1488d/ovn-acl-logging/0.log" Apr 21 03:57:44.308566 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:44.308546 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" event={"ID":"3a284a31-b4ea-4280-a6d1-b84390d1488d","Type":"ContainerStarted","Data":"c923e8f8ec9724d936d2a4ffe18cf19e2428acf21d8a2fb2725baf87f9084127"} Apr 21 03:57:44.308969 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:44.308951 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:44.309031 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:44.308977 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:44.322878 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:44.322860 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:57:44.346377 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:44.346336 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" podStartSLOduration=10.352772646 podStartE2EDuration="27.346324694s" podCreationTimestamp="2026-04-21 03:57:17 +0000 UTC" firstStartedPulling="2026-04-21 03:57:19.701233549 +0000 UTC m=+3.129005404" lastFinishedPulling="2026-04-21 03:57:36.694785585 +0000 UTC m=+20.122557452" observedRunningTime="2026-04-21 03:57:44.346272785 +0000 UTC m=+27.774044660" watchObservedRunningTime="2026-04-21 03:57:44.346324694 +0000 UTC m=+27.774096552" Apr 21 03:57:44.939543 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:44.939510 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/76716b30-cec0-4d8e-8c08-452dfeb18893-original-pull-secret\") pod \"global-pull-secret-syncer-m69zf\" (UID: \"76716b30-cec0-4d8e-8c08-452dfeb18893\") " pod="kube-system/global-pull-secret-syncer-m69zf" Apr 21 03:57:44.939684 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:44.939617 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 03:57:44.939684 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:44.939664 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76716b30-cec0-4d8e-8c08-452dfeb18893-original-pull-secret podName:76716b30-cec0-4d8e-8c08-452dfeb18893 nodeName:}" failed. 
No retries permitted until 2026-04-21 03:57:52.93965129 +0000 UTC m=+36.367423143 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/76716b30-cec0-4d8e-8c08-452dfeb18893-original-pull-secret") pod "global-pull-secret-syncer-m69zf" (UID: "76716b30-cec0-4d8e-8c08-452dfeb18893") : object "kube-system"/"original-pull-secret" not registered Apr 21 03:57:45.312335 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:45.312239 2570 generic.go:358] "Generic (PLEG): container finished" podID="d9c4a2fd-4534-495a-8f40-6d8faf8f87e6" containerID="a19cdf114f4294be35068aa4cab43e0d0ac76f231e1b781165288f5dbe2a73a7" exitCode=0 Apr 21 03:57:45.312335 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:45.312317 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xddzt" event={"ID":"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6","Type":"ContainerDied","Data":"a19cdf114f4294be35068aa4cab43e0d0ac76f231e1b781165288f5dbe2a73a7"} Apr 21 03:57:46.156365 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:46.156291 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:57:46.156509 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:46.156297 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m69zf" Apr 21 03:57:46.156509 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:46.156296 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:57:46.156509 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:46.156462 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lf2dl" podUID="0642b1aa-ff76-4694-bad0-be2656b81005" Apr 21 03:57:46.156659 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:46.156588 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m69zf" podUID="76716b30-cec0-4d8e-8c08-452dfeb18893" Apr 21 03:57:46.156710 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:46.156657 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ztjfh" podUID="be04fd1e-83bf-49d7-8c60-4323b986ab81" Apr 21 03:57:48.156465 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:48.156224 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:57:48.156465 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:48.156225 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:57:48.156943 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:48.156507 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ztjfh" podUID="be04fd1e-83bf-49d7-8c60-4323b986ab81" Apr 21 03:57:48.156943 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:48.156569 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lf2dl" podUID="0642b1aa-ff76-4694-bad0-be2656b81005" Apr 21 03:57:48.156943 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:48.156236 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m69zf" Apr 21 03:57:48.157136 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:48.156991 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m69zf" podUID="76716b30-cec0-4d8e-8c08-452dfeb18893" Apr 21 03:57:49.910399 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.910320 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-93.ec2.internal" event="NodeReady" Apr 21 03:57:49.911062 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.910471 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 03:57:49.939165 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.939133 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-cctsl"] Apr 21 03:57:49.943718 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.943696 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-f9667fbb5-8hs2h"] Apr 21 03:57:49.943877 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.943859 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-cctsl" Apr 21 03:57:49.945968 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.945941 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 21 03:57:49.945968 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.945962 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-hvdwx\"" Apr 21 03:57:49.946126 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.945967 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 21 03:57:49.946950 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.946931 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:57:49.948607 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.948589 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-cctsl"] Apr 21 03:57:49.948874 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.948857 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 03:57:49.949402 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.949383 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 03:57:49.949672 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.949632 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 03:57:49.949752 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.949685 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fzhgd\"" Apr 21 03:57:49.955698 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.955667 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 03:57:49.962398 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.962380 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-f9667fbb5-8hs2h"] Apr 21 03:57:49.962623 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.962599 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-879kk"] Apr 21 03:57:49.969092 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.969069 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-lcnh4"] Apr 21 03:57:49.969253 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.969234 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-879kk" Apr 21 03:57:49.971443 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.971425 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 03:57:49.971568 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.971549 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7tx7c\"" Apr 21 03:57:49.971670 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.971527 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 03:57:49.972662 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.972641 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lcnh4" Apr 21 03:57:49.974220 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.974194 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-879kk"] Apr 21 03:57:49.974886 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.974842 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 03:57:49.975018 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.974993 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-7mknc\"" Apr 21 03:57:49.975159 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.975120 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 03:57:49.975230 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.975215 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 03:57:49.978022 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:49.977905 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lcnh4"] Apr 21 03:57:50.080221 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.080188 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj2g2\" (UniqueName: \"kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-kube-api-access-qj2g2\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:57:50.080398 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.080246 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-bound-sa-token\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:57:50.080398 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.080285 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/97f709de-df2d-4000-add8-f6588eda3b15-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cctsl\" (UID: \"97f709de-df2d-4000-add8-f6588eda3b15\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cctsl" Apr 21 03:57:50.080398 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.080306 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3c847fe8-2248-42f1-9774-c95b562f9b61-ca-trust-extracted\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:57:50.080551 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.080395 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3c847fe8-2248-42f1-9774-c95b562f9b61-image-registry-private-configuration\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: 
\"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:57:50.080551 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.080437 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c847fe8-2248-42f1-9774-c95b562f9b61-trusted-ca\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:57:50.080551 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.080485 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e78af541-e68e-436e-8f47-28080dca3c2d-cert\") pod \"ingress-canary-lcnh4\" (UID: \"e78af541-e68e-436e-8f47-28080dca3c2d\") " pod="openshift-ingress-canary/ingress-canary-lcnh4" Apr 21 03:57:50.080551 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.080531 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3c847fe8-2248-42f1-9774-c95b562f9b61-installation-pull-secrets\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:57:50.080712 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.080570 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/97f709de-df2d-4000-add8-f6588eda3b15-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-cctsl\" (UID: \"97f709de-df2d-4000-add8-f6588eda3b15\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cctsl" Apr 21 03:57:50.080712 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.080590 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6fd9ab38-f364-44df-8d61-4e3ba0946953-tmp-dir\") pod \"dns-default-879kk\" (UID: \"6fd9ab38-f364-44df-8d61-4e3ba0946953\") " pod="openshift-dns/dns-default-879kk" Apr 21 03:57:50.080712 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.080610 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvs4c\" (UniqueName: \"kubernetes.io/projected/6fd9ab38-f364-44df-8d61-4e3ba0946953-kube-api-access-zvs4c\") pod \"dns-default-879kk\" (UID: \"6fd9ab38-f364-44df-8d61-4e3ba0946953\") " pod="openshift-dns/dns-default-879kk" Apr 21 03:57:50.080712 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.080634 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6fd9ab38-f364-44df-8d61-4e3ba0946953-config-volume\") pod \"dns-default-879kk\" (UID: \"6fd9ab38-f364-44df-8d61-4e3ba0946953\") " pod="openshift-dns/dns-default-879kk" Apr 21 03:57:50.080880 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.080754 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9bhq\" (UniqueName: \"kubernetes.io/projected/e78af541-e68e-436e-8f47-28080dca3c2d-kube-api-access-s9bhq\") pod \"ingress-canary-lcnh4\" (UID: \"e78af541-e68e-436e-8f47-28080dca3c2d\") " pod="openshift-ingress-canary/ingress-canary-lcnh4" Apr 21 03:57:50.080880 
ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.080790 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-certificates\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:57:50.080880 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.080823 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6fd9ab38-f364-44df-8d61-4e3ba0946953-metrics-tls\") pod \"dns-default-879kk\" (UID: \"6fd9ab38-f364-44df-8d61-4e3ba0946953\") " pod="openshift-dns/dns-default-879kk" Apr 21 03:57:50.080992 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.080882 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:57:50.156609 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.156538 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:57:50.156914 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.156683 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:57:50.156914 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.156712 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-m69zf" Apr 21 03:57:50.159595 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.159536 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 03:57:50.159725 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.159660 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 03:57:50.159725 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.159659 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-k29s5\"" Apr 21 03:57:50.159837 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.159762 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 03:57:50.159875 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.159833 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 03:57:50.159915 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.159884 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-sjhjf\"" Apr 21 03:57:50.181224 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.181192 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qj2g2\" (UniqueName: \"kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-kube-api-access-qj2g2\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:57:50.181393 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.181253 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-bound-sa-token\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:57:50.181393 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.181281 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/97f709de-df2d-4000-add8-f6588eda3b15-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cctsl\" (UID: \"97f709de-df2d-4000-add8-f6588eda3b15\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cctsl" Apr 21 03:57:50.181393 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.181298 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3c847fe8-2248-42f1-9774-c95b562f9b61-ca-trust-extracted\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:57:50.181393 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.181322 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3c847fe8-2248-42f1-9774-c95b562f9b61-image-registry-private-configuration\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " 
pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:57:50.181393 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.181341 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c847fe8-2248-42f1-9774-c95b562f9b61-trusted-ca\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:57:50.181393 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.181371 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e78af541-e68e-436e-8f47-28080dca3c2d-cert\") pod \"ingress-canary-lcnh4\" (UID: \"e78af541-e68e-436e-8f47-28080dca3c2d\") " pod="openshift-ingress-canary/ingress-canary-lcnh4" Apr 21 03:57:50.181393 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.181395 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3c847fe8-2248-42f1-9774-c95b562f9b61-installation-pull-secrets\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:57:50.181675 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.181423 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/97f709de-df2d-4000-add8-f6588eda3b15-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-cctsl\" (UID: \"97f709de-df2d-4000-add8-f6588eda3b15\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cctsl" Apr 21 03:57:50.181675 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.181440 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6fd9ab38-f364-44df-8d61-4e3ba0946953-tmp-dir\") pod \"dns-default-879kk\" (UID: \"6fd9ab38-f364-44df-8d61-4e3ba0946953\") " pod="openshift-dns/dns-default-879kk" Apr 21 03:57:50.181675 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.181457 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvs4c\" (UniqueName: \"kubernetes.io/projected/6fd9ab38-f364-44df-8d61-4e3ba0946953-kube-api-access-zvs4c\") pod \"dns-default-879kk\" (UID: \"6fd9ab38-f364-44df-8d61-4e3ba0946953\") " pod="openshift-dns/dns-default-879kk" Apr 21 03:57:50.181675 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.181478 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6fd9ab38-f364-44df-8d61-4e3ba0946953-config-volume\") pod \"dns-default-879kk\" (UID: \"6fd9ab38-f364-44df-8d61-4e3ba0946953\") " pod="openshift-dns/dns-default-879kk" Apr 21 03:57:50.181675 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.181512 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9bhq\" (UniqueName: \"kubernetes.io/projected/e78af541-e68e-436e-8f47-28080dca3c2d-kube-api-access-s9bhq\") pod \"ingress-canary-lcnh4\" (UID: \"e78af541-e68e-436e-8f47-28080dca3c2d\") " pod="openshift-ingress-canary/ingress-canary-lcnh4" Apr 21 03:57:50.181675 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.181528 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-certificates\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:57:50.181675 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.181549 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6fd9ab38-f364-44df-8d61-4e3ba0946953-metrics-tls\") pod \"dns-default-879kk\" (UID: \"6fd9ab38-f364-44df-8d61-4e3ba0946953\") " pod="openshift-dns/dns-default-879kk" Apr 21 03:57:50.181675 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.181567 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:57:50.181675 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:50.181593 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 03:57:50.181675 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:50.181664 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e78af541-e68e-436e-8f47-28080dca3c2d-cert podName:e78af541-e68e-436e-8f47-28080dca3c2d nodeName:}" failed. No retries permitted until 2026-04-21 03:57:50.681644613 +0000 UTC m=+34.109416471 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e78af541-e68e-436e-8f47-28080dca3c2d-cert") pod "ingress-canary-lcnh4" (UID: "e78af541-e68e-436e-8f47-28080dca3c2d") : secret "canary-serving-cert" not found Apr 21 03:57:50.181675 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:50.181678 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 03:57:50.182083 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:50.181691 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f9667fbb5-8hs2h: secret "image-registry-tls" not found Apr 21 03:57:50.182083 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:50.181750 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls podName:3c847fe8-2248-42f1-9774-c95b562f9b61 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:50.681739274 +0000 UTC m=+34.109511127 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls") pod "image-registry-f9667fbb5-8hs2h" (UID: "3c847fe8-2248-42f1-9774-c95b562f9b61") : secret "image-registry-tls" not found Apr 21 03:57:50.182168 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:50.182082 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 03:57:50.182168 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:50.182158 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97f709de-df2d-4000-add8-f6588eda3b15-networking-console-plugin-cert podName:97f709de-df2d-4000-add8-f6588eda3b15 nodeName:}" failed. 
No retries permitted until 2026-04-21 03:57:50.682143097 +0000 UTC m=+34.109914950 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/97f709de-df2d-4000-add8-f6588eda3b15-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-cctsl" (UID: "97f709de-df2d-4000-add8-f6588eda3b15") : secret "networking-console-plugin-cert" not found Apr 21 03:57:50.183316 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.182516 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3c847fe8-2248-42f1-9774-c95b562f9b61-ca-trust-extracted\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:57:50.183316 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.182794 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6fd9ab38-f364-44df-8d61-4e3ba0946953-tmp-dir\") pod \"dns-default-879kk\" (UID: \"6fd9ab38-f364-44df-8d61-4e3ba0946953\") " pod="openshift-dns/dns-default-879kk" Apr 21 03:57:50.183316 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.182870 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6fd9ab38-f364-44df-8d61-4e3ba0946953-config-volume\") pod \"dns-default-879kk\" (UID: \"6fd9ab38-f364-44df-8d61-4e3ba0946953\") " pod="openshift-dns/dns-default-879kk" Apr 21 03:57:50.183316 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:50.183279 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 03:57:50.183578 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.183323 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-certificates\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:57:50.183578 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:50.183334 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fd9ab38-f364-44df-8d61-4e3ba0946953-metrics-tls podName:6fd9ab38-f364-44df-8d61-4e3ba0946953 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:50.683319315 +0000 UTC m=+34.111091184 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6fd9ab38-f364-44df-8d61-4e3ba0946953-metrics-tls") pod "dns-default-879kk" (UID: "6fd9ab38-f364-44df-8d61-4e3ba0946953") : secret "dns-default-metrics-tls" not found Apr 21 03:57:50.183578 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.183346 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c847fe8-2248-42f1-9774-c95b562f9b61-trusted-ca\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:57:50.183578 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.183424 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/97f709de-df2d-4000-add8-f6588eda3b15-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-cctsl\" (UID: \"97f709de-df2d-4000-add8-f6588eda3b15\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cctsl" Apr 21 03:57:50.186609 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.186586 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3c847fe8-2248-42f1-9774-c95b562f9b61-image-registry-private-configuration\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:57:50.186609 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.186598 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3c847fe8-2248-42f1-9774-c95b562f9b61-installation-pull-secrets\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:57:50.190398 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.190373 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-bound-sa-token\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:57:50.190505 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.190457 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj2g2\" (UniqueName: \"kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-kube-api-access-qj2g2\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:57:50.190587 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.190568 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9bhq\" (UniqueName: \"kubernetes.io/projected/e78af541-e68e-436e-8f47-28080dca3c2d-kube-api-access-s9bhq\") pod \"ingress-canary-lcnh4\" (UID: \"e78af541-e68e-436e-8f47-28080dca3c2d\") " pod="openshift-ingress-canary/ingress-canary-lcnh4" Apr 21 03:57:50.190830 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.190808 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvs4c\" (UniqueName: \"kubernetes.io/projected/6fd9ab38-f364-44df-8d61-4e3ba0946953-kube-api-access-zvs4c\") pod 
\"dns-default-879kk\" (UID: \"6fd9ab38-f364-44df-8d61-4e3ba0946953\") " pod="openshift-dns/dns-default-879kk" Apr 21 03:57:50.685357 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.685323 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6fd9ab38-f364-44df-8d61-4e3ba0946953-metrics-tls\") pod \"dns-default-879kk\" (UID: \"6fd9ab38-f364-44df-8d61-4e3ba0946953\") " pod="openshift-dns/dns-default-879kk" Apr 21 03:57:50.685510 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.685366 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:57:50.685510 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.685423 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/97f709de-df2d-4000-add8-f6588eda3b15-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cctsl\" (UID: \"97f709de-df2d-4000-add8-f6588eda3b15\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cctsl" Apr 21 03:57:50.685510 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.685468 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e78af541-e68e-436e-8f47-28080dca3c2d-cert\") pod \"ingress-canary-lcnh4\" (UID: \"e78af541-e68e-436e-8f47-28080dca3c2d\") " pod="openshift-ingress-canary/ingress-canary-lcnh4" Apr 21 03:57:50.685510 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:50.685477 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 03:57:50.685693 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:50.685539 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 03:57:50.685693 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:50.685551 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 03:57:50.685693 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:50.685557 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f9667fbb5-8hs2h: secret "image-registry-tls" not found Apr 21 03:57:50.685693 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:50.685554 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 03:57:50.685693 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:50.685552 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fd9ab38-f364-44df-8d61-4e3ba0946953-metrics-tls podName:6fd9ab38-f364-44df-8d61-4e3ba0946953 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:51.685532315 +0000 UTC m=+35.113304185 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6fd9ab38-f364-44df-8d61-4e3ba0946953-metrics-tls") pod "dns-default-879kk" (UID: "6fd9ab38-f364-44df-8d61-4e3ba0946953") : secret "dns-default-metrics-tls" not found Apr 21 03:57:50.685693 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:50.685603 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e78af541-e68e-436e-8f47-28080dca3c2d-cert podName:e78af541-e68e-436e-8f47-28080dca3c2d nodeName:}" failed. No retries permitted until 2026-04-21 03:57:51.685592537 +0000 UTC m=+35.113364390 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e78af541-e68e-436e-8f47-28080dca3c2d-cert") pod "ingress-canary-lcnh4" (UID: "e78af541-e68e-436e-8f47-28080dca3c2d") : secret "canary-serving-cert" not found Apr 21 03:57:50.685693 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:50.685617 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls podName:3c847fe8-2248-42f1-9774-c95b562f9b61 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:51.685609338 +0000 UTC m=+35.113381198 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls") pod "image-registry-f9667fbb5-8hs2h" (UID: "3c847fe8-2248-42f1-9774-c95b562f9b61") : secret "image-registry-tls" not found Apr 21 03:57:50.685693 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:50.685637 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97f709de-df2d-4000-add8-f6588eda3b15-networking-console-plugin-cert podName:97f709de-df2d-4000-add8-f6588eda3b15 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:51.685629442 +0000 UTC m=+35.113401295 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/97f709de-df2d-4000-add8-f6588eda3b15-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-cctsl" (UID: "97f709de-df2d-4000-add8-f6588eda3b15") : secret "networking-console-plugin-cert" not found Apr 21 03:57:50.786298 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.786261 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs\") pod \"network-metrics-daemon-lf2dl\" (UID: \"0642b1aa-ff76-4694-bad0-be2656b81005\") " pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:57:50.786468 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:50.786362 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 03:57:50.786468 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:50.786412 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs podName:0642b1aa-ff76-4694-bad0-be2656b81005 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:22.786399967 +0000 UTC m=+66.214171821 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs") pod "network-metrics-daemon-lf2dl" (UID: "0642b1aa-ff76-4694-bad0-be2656b81005") : secret "metrics-daemon-secret" not found Apr 21 03:57:50.887587 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.887550 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8tcf\" (UniqueName: \"kubernetes.io/projected/be04fd1e-83bf-49d7-8c60-4323b986ab81-kube-api-access-c8tcf\") pod \"network-check-target-ztjfh\" (UID: \"be04fd1e-83bf-49d7-8c60-4323b986ab81\") " pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:57:50.889992 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:50.889963 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8tcf\" (UniqueName: \"kubernetes.io/projected/be04fd1e-83bf-49d7-8c60-4323b986ab81-kube-api-access-c8tcf\") pod \"network-check-target-ztjfh\" (UID: \"be04fd1e-83bf-49d7-8c60-4323b986ab81\") " pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:57:51.069591 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:51.069505 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:57:51.252222 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:51.252082 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ztjfh"] Apr 21 03:57:51.255731 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:51.255708 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe04fd1e_83bf_49d7_8c60_4323b986ab81.slice/crio-5e3c75f9c538f361f3a8d1d526c33cdc77fa134f9af32add8fab73c1bc8a026f WatchSource:0}: Error finding container 5e3c75f9c538f361f3a8d1d526c33cdc77fa134f9af32add8fab73c1bc8a026f: Status 404 returned error can't find the container with id 5e3c75f9c538f361f3a8d1d526c33cdc77fa134f9af32add8fab73c1bc8a026f Apr 21 03:57:51.325556 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:51.325486 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ztjfh" event={"ID":"be04fd1e-83bf-49d7-8c60-4323b986ab81","Type":"ContainerStarted","Data":"5e3c75f9c538f361f3a8d1d526c33cdc77fa134f9af32add8fab73c1bc8a026f"} Apr 21 03:57:51.327908 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:51.327881 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xddzt" event={"ID":"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6","Type":"ContainerStarted","Data":"d718926158c85ba66d01d7e2f67553dce315c16c4f22fcbd1fe50880c289ee72"} Apr 21 03:57:51.694120 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:51.694023 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/97f709de-df2d-4000-add8-f6588eda3b15-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cctsl\" (UID: \"97f709de-df2d-4000-add8-f6588eda3b15\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cctsl" Apr 21 03:57:51.694120 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:51.694096 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e78af541-e68e-436e-8f47-28080dca3c2d-cert\") pod 
\"ingress-canary-lcnh4\" (UID: \"e78af541-e68e-436e-8f47-28080dca3c2d\") " pod="openshift-ingress-canary/ingress-canary-lcnh4" Apr 21 03:57:51.694344 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:51.694158 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6fd9ab38-f364-44df-8d61-4e3ba0946953-metrics-tls\") pod \"dns-default-879kk\" (UID: \"6fd9ab38-f364-44df-8d61-4e3ba0946953\") " pod="openshift-dns/dns-default-879kk" Apr 21 03:57:51.694344 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:51.694191 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:57:51.694344 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:51.694191 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 03:57:51.694344 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:51.694251 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 03:57:51.694344 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:51.694299 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fd9ab38-f364-44df-8d61-4e3ba0946953-metrics-tls podName:6fd9ab38-f364-44df-8d61-4e3ba0946953 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:53.69428154 +0000 UTC m=+37.122053403 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6fd9ab38-f364-44df-8d61-4e3ba0946953-metrics-tls") pod "dns-default-879kk" (UID: "6fd9ab38-f364-44df-8d61-4e3ba0946953") : secret "dns-default-metrics-tls" not found Apr 21 03:57:51.694344 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:51.694297 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 03:57:51.694344 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:51.694316 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97f709de-df2d-4000-add8-f6588eda3b15-networking-console-plugin-cert podName:97f709de-df2d-4000-add8-f6588eda3b15 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:53.694307414 +0000 UTC m=+37.122079268 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/97f709de-df2d-4000-add8-f6588eda3b15-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-cctsl" (UID: "97f709de-df2d-4000-add8-f6588eda3b15") : secret "networking-console-plugin-cert" not found Apr 21 03:57:51.694344 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:51.694316 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f9667fbb5-8hs2h: secret "image-registry-tls" not found Apr 21 03:57:51.694656 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:51.694366 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls podName:3c847fe8-2248-42f1-9774-c95b562f9b61 nodeName:}" failed. 
No retries permitted until 2026-04-21 03:57:53.694353255 +0000 UTC m=+37.122125109 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls") pod "image-registry-f9667fbb5-8hs2h" (UID: "3c847fe8-2248-42f1-9774-c95b562f9b61") : secret "image-registry-tls" not found Apr 21 03:57:51.694656 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:51.694240 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 03:57:51.694656 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:51.694413 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e78af541-e68e-436e-8f47-28080dca3c2d-cert podName:e78af541-e68e-436e-8f47-28080dca3c2d nodeName:}" failed. No retries permitted until 2026-04-21 03:57:53.694403017 +0000 UTC m=+37.122174871 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e78af541-e68e-436e-8f47-28080dca3c2d-cert") pod "ingress-canary-lcnh4" (UID: "e78af541-e68e-436e-8f47-28080dca3c2d") : secret "canary-serving-cert" not found Apr 21 03:57:52.332634 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:52.332597 2570 generic.go:358] "Generic (PLEG): container finished" podID="d9c4a2fd-4534-495a-8f40-6d8faf8f87e6" containerID="d718926158c85ba66d01d7e2f67553dce315c16c4f22fcbd1fe50880c289ee72" exitCode=0 Apr 21 03:57:52.333002 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:52.332646 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xddzt" event={"ID":"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6","Type":"ContainerDied","Data":"d718926158c85ba66d01d7e2f67553dce315c16c4f22fcbd1fe50880c289ee72"} Apr 21 03:57:52.984108 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:52.984077 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9b95b8b9c-n4dlr"] Apr 21 03:57:53.004992 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.004964 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/76716b30-cec0-4d8e-8c08-452dfeb18893-original-pull-secret\") pod \"global-pull-secret-syncer-m69zf\" (UID: \"76716b30-cec0-4d8e-8c08-452dfeb18893\") " pod="kube-system/global-pull-secret-syncer-m69zf" Apr 21 03:57:53.007190 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.007145 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw"] Apr 21 03:57:53.007317 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.007255 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9b95b8b9c-n4dlr" Apr 21 03:57:53.009151 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.009125 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/76716b30-cec0-4d8e-8c08-452dfeb18893-original-pull-secret\") pod \"global-pull-secret-syncer-m69zf\" (UID: \"76716b30-cec0-4d8e-8c08-452dfeb18893\") " pod="kube-system/global-pull-secret-syncer-m69zf" Apr 21 03:57:53.009821 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.009799 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 21 03:57:53.010201 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.010183 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 21 03:57:53.010442 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.010421 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 21 03:57:53.010590 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.010572 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-tj2lg\"" Apr 21 03:57:53.010798 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.010780 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 21 03:57:53.015980 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.015960 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9b95b8b9c-n4dlr"] Apr 21 03:57:53.016105 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.015989 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw"] Apr 21 03:57:53.016168 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.016104 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" Apr 21 03:57:53.018591 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.018382 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 21 03:57:53.018591 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.018391 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 21 03:57:53.018591 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.018444 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 21 03:57:53.018591 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.018493 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 21 03:57:53.025413 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.025392 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-58cc59f686-kn2l9"] Apr 21 03:57:53.040220 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.040197 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-58cc59f686-kn2l9"] Apr 21 03:57:53.040332 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.040321 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58cc59f686-kn2l9" Apr 21 03:57:53.042453 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.042434 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 21 03:57:53.106328 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.106298 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/12a491ae-8363-4a48-8ded-779e5c0cb064-ca\") pod \"cluster-proxy-proxy-agent-68599484b9-vwwzw\" (UID: \"12a491ae-8363-4a48-8ded-779e5c0cb064\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" Apr 21 03:57:53.106477 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.106391 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d4713f9e-6e12-483d-a2ee-70e9f784ce0b-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-9b95b8b9c-n4dlr\" (UID: \"d4713f9e-6e12-483d-a2ee-70e9f784ce0b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9b95b8b9c-n4dlr" Apr 21 03:57:53.106477 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.106413 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/12a491ae-8363-4a48-8ded-779e5c0cb064-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-68599484b9-vwwzw\" (UID: \"12a491ae-8363-4a48-8ded-779e5c0cb064\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" Apr 21 03:57:53.106477 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.106436 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/12a491ae-8363-4a48-8ded-779e5c0cb064-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-68599484b9-vwwzw\" (UID: \"12a491ae-8363-4a48-8ded-779e5c0cb064\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" Apr 21 03:57:53.106477 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.106473 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkjmw\" (UniqueName: \"kubernetes.io/projected/12a491ae-8363-4a48-8ded-779e5c0cb064-kube-api-access-qkjmw\") pod \"cluster-proxy-proxy-agent-68599484b9-vwwzw\" (UID: \"12a491ae-8363-4a48-8ded-779e5c0cb064\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" Apr 21 03:57:53.106665 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.106497 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6j2k\" (UniqueName: \"kubernetes.io/projected/d4713f9e-6e12-483d-a2ee-70e9f784ce0b-kube-api-access-l6j2k\") pod \"managed-serviceaccount-addon-agent-9b95b8b9c-n4dlr\" (UID: \"d4713f9e-6e12-483d-a2ee-70e9f784ce0b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9b95b8b9c-n4dlr" Apr 21 03:57:53.106665 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.106520 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/12a491ae-8363-4a48-8ded-779e5c0cb064-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-68599484b9-vwwzw\" (UID: \"12a491ae-8363-4a48-8ded-779e5c0cb064\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" Apr 21 03:57:53.106665 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.106609 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/12a491ae-8363-4a48-8ded-779e5c0cb064-hub\") pod \"cluster-proxy-proxy-agent-68599484b9-vwwzw\" (UID: \"12a491ae-8363-4a48-8ded-779e5c0cb064\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" Apr 21 03:57:53.183539 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.183508 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-m69zf" Apr 21 03:57:53.207549 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.207514 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/12a491ae-8363-4a48-8ded-779e5c0cb064-hub\") pod \"cluster-proxy-proxy-agent-68599484b9-vwwzw\" (UID: \"12a491ae-8363-4a48-8ded-779e5c0cb064\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" Apr 21 03:57:53.207710 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.207583 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/12a491ae-8363-4a48-8ded-779e5c0cb064-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-68599484b9-vwwzw\" (UID: \"12a491ae-8363-4a48-8ded-779e5c0cb064\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" Apr 21 03:57:53.207710 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.207626 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/12a491ae-8363-4a48-8ded-779e5c0cb064-ca\") pod \"cluster-proxy-proxy-agent-68599484b9-vwwzw\" (UID: \"12a491ae-8363-4a48-8ded-779e5c0cb064\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" Apr 21 03:57:53.207822 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.207708 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d4713f9e-6e12-483d-a2ee-70e9f784ce0b-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-9b95b8b9c-n4dlr\" (UID: \"d4713f9e-6e12-483d-a2ee-70e9f784ce0b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9b95b8b9c-n4dlr" Apr 21 03:57:53.207822 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.207761 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/12a491ae-8363-4a48-8ded-779e5c0cb064-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-68599484b9-vwwzw\" (UID: \"12a491ae-8363-4a48-8ded-779e5c0cb064\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" Apr 21 03:57:53.207822 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.207803 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l6j2k\" (UniqueName: \"kubernetes.io/projected/d4713f9e-6e12-483d-a2ee-70e9f784ce0b-kube-api-access-l6j2k\") pod \"managed-serviceaccount-addon-agent-9b95b8b9c-n4dlr\" (UID: \"d4713f9e-6e12-483d-a2ee-70e9f784ce0b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9b95b8b9c-n4dlr" Apr 21 03:57:53.208329 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.208170 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5tgw\" (UniqueName: \"kubernetes.io/projected/84abcc8f-012c-49d2-bd91-5947a2abcc8a-kube-api-access-t5tgw\") pod \"klusterlet-addon-workmgr-58cc59f686-kn2l9\" (UID: \"84abcc8f-012c-49d2-bd91-5947a2abcc8a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58cc59f686-kn2l9" Apr 21 03:57:53.208329 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.208256 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: 
\"kubernetes.io/configmap/12a491ae-8363-4a48-8ded-779e5c0cb064-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-68599484b9-vwwzw\" (UID: \"12a491ae-8363-4a48-8ded-779e5c0cb064\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" Apr 21 03:57:53.208329 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.208293 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/84abcc8f-012c-49d2-bd91-5947a2abcc8a-klusterlet-config\") pod \"klusterlet-addon-workmgr-58cc59f686-kn2l9\" (UID: \"84abcc8f-012c-49d2-bd91-5947a2abcc8a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58cc59f686-kn2l9" Apr 21 03:57:53.209890 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.209846 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkjmw\" (UniqueName: \"kubernetes.io/projected/12a491ae-8363-4a48-8ded-779e5c0cb064-kube-api-access-qkjmw\") pod \"cluster-proxy-proxy-agent-68599484b9-vwwzw\" (UID: \"12a491ae-8363-4a48-8ded-779e5c0cb064\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" Apr 21 03:57:53.210002 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.209964 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/84abcc8f-012c-49d2-bd91-5947a2abcc8a-tmp\") pod \"klusterlet-addon-workmgr-58cc59f686-kn2l9\" (UID: \"84abcc8f-012c-49d2-bd91-5947a2abcc8a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58cc59f686-kn2l9" Apr 21 03:57:53.210700 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.210668 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/12a491ae-8363-4a48-8ded-779e5c0cb064-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-68599484b9-vwwzw\" (UID: \"12a491ae-8363-4a48-8ded-779e5c0cb064\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" Apr 21 03:57:53.211473 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.211085 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/12a491ae-8363-4a48-8ded-779e5c0cb064-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-68599484b9-vwwzw\" (UID: \"12a491ae-8363-4a48-8ded-779e5c0cb064\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" Apr 21 03:57:53.211473 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.211104 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/12a491ae-8363-4a48-8ded-779e5c0cb064-hub\") pod \"cluster-proxy-proxy-agent-68599484b9-vwwzw\" (UID: \"12a491ae-8363-4a48-8ded-779e5c0cb064\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" Apr 21 03:57:53.211473 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.211352 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d4713f9e-6e12-483d-a2ee-70e9f784ce0b-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-9b95b8b9c-n4dlr\" (UID: \"d4713f9e-6e12-483d-a2ee-70e9f784ce0b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9b95b8b9c-n4dlr" Apr 21 03:57:53.211660 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.211473 
2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/12a491ae-8363-4a48-8ded-779e5c0cb064-ca\") pod \"cluster-proxy-proxy-agent-68599484b9-vwwzw\" (UID: \"12a491ae-8363-4a48-8ded-779e5c0cb064\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" Apr 21 03:57:53.214193 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.213079 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/12a491ae-8363-4a48-8ded-779e5c0cb064-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-68599484b9-vwwzw\" (UID: \"12a491ae-8363-4a48-8ded-779e5c0cb064\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" Apr 21 03:57:53.217634 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.217611 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkjmw\" (UniqueName: \"kubernetes.io/projected/12a491ae-8363-4a48-8ded-779e5c0cb064-kube-api-access-qkjmw\") pod \"cluster-proxy-proxy-agent-68599484b9-vwwzw\" (UID: \"12a491ae-8363-4a48-8ded-779e5c0cb064\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" Apr 21 03:57:53.217928 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.217906 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6j2k\" (UniqueName: \"kubernetes.io/projected/d4713f9e-6e12-483d-a2ee-70e9f784ce0b-kube-api-access-l6j2k\") pod \"managed-serviceaccount-addon-agent-9b95b8b9c-n4dlr\" (UID: \"d4713f9e-6e12-483d-a2ee-70e9f784ce0b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9b95b8b9c-n4dlr" Apr 21 03:57:53.311208 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.311132 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5tgw\" (UniqueName: \"kubernetes.io/projected/84abcc8f-012c-49d2-bd91-5947a2abcc8a-kube-api-access-t5tgw\") pod \"klusterlet-addon-workmgr-58cc59f686-kn2l9\" (UID: \"84abcc8f-012c-49d2-bd91-5947a2abcc8a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58cc59f686-kn2l9" Apr 21 03:57:53.311208 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.311205 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/84abcc8f-012c-49d2-bd91-5947a2abcc8a-klusterlet-config\") pod \"klusterlet-addon-workmgr-58cc59f686-kn2l9\" (UID: \"84abcc8f-012c-49d2-bd91-5947a2abcc8a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58cc59f686-kn2l9" Apr 21 03:57:53.311410 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.311256 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/84abcc8f-012c-49d2-bd91-5947a2abcc8a-tmp\") pod \"klusterlet-addon-workmgr-58cc59f686-kn2l9\" (UID: \"84abcc8f-012c-49d2-bd91-5947a2abcc8a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58cc59f686-kn2l9" Apr 21 03:57:53.316450 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.316421 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/84abcc8f-012c-49d2-bd91-5947a2abcc8a-tmp\") pod \"klusterlet-addon-workmgr-58cc59f686-kn2l9\" (UID: \"84abcc8f-012c-49d2-bd91-5947a2abcc8a\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58cc59f686-kn2l9" Apr 21 03:57:53.316793 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.316766 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/84abcc8f-012c-49d2-bd91-5947a2abcc8a-klusterlet-config\") pod \"klusterlet-addon-workmgr-58cc59f686-kn2l9\" (UID: \"84abcc8f-012c-49d2-bd91-5947a2abcc8a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58cc59f686-kn2l9" Apr 21 03:57:53.318931 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.318907 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5tgw\" (UniqueName: \"kubernetes.io/projected/84abcc8f-012c-49d2-bd91-5947a2abcc8a-kube-api-access-t5tgw\") pod \"klusterlet-addon-workmgr-58cc59f686-kn2l9\" (UID: \"84abcc8f-012c-49d2-bd91-5947a2abcc8a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58cc59f686-kn2l9" Apr 21 03:57:53.332366 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.332345 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9b95b8b9c-n4dlr" Apr 21 03:57:53.337470 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.337442 2570 generic.go:358] "Generic (PLEG): container finished" podID="d9c4a2fd-4534-495a-8f40-6d8faf8f87e6" containerID="24da4771520f96ce77a2e18827ef0904c15189dce0426529d53e57c3f09877a2" exitCode=0 Apr 21 03:57:53.337829 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.337496 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xddzt" event={"ID":"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6","Type":"ContainerDied","Data":"24da4771520f96ce77a2e18827ef0904c15189dce0426529d53e57c3f09877a2"} Apr 21 03:57:53.337829 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.337745 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" Apr 21 03:57:53.362528 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.362498 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58cc59f686-kn2l9" Apr 21 03:57:53.715009 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.714925 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:57:53.715009 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.714979 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/97f709de-df2d-4000-add8-f6588eda3b15-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cctsl\" (UID: \"97f709de-df2d-4000-add8-f6588eda3b15\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cctsl" Apr 21 03:57:53.715009 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.715015 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e78af541-e68e-436e-8f47-28080dca3c2d-cert\") pod \"ingress-canary-lcnh4\" (UID: \"e78af541-e68e-436e-8f47-28080dca3c2d\") " pod="openshift-ingress-canary/ingress-canary-lcnh4" Apr 21 03:57:53.715250 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:53.715112 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 03:57:53.715250 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:53.715131 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f9667fbb5-8hs2h: secret "image-registry-tls" not found Apr 21 03:57:53.715250 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:53.715131 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 03:57:53.715250 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:53.715131 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 03:57:53.715250 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:53.715180 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls podName:3c847fe8-2248-42f1-9774-c95b562f9b61 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:57.715165746 +0000 UTC m=+41.142937599 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls") pod "image-registry-f9667fbb5-8hs2h" (UID: "3c847fe8-2248-42f1-9774-c95b562f9b61") : secret "image-registry-tls" not found Apr 21 03:57:53.715250 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:53.715204 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97f709de-df2d-4000-add8-f6588eda3b15-networking-console-plugin-cert podName:97f709de-df2d-4000-add8-f6588eda3b15 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:57.715186511 +0000 UTC m=+41.142958364 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/97f709de-df2d-4000-add8-f6588eda3b15-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-cctsl" (UID: "97f709de-df2d-4000-add8-f6588eda3b15") : secret "networking-console-plugin-cert" not found Apr 21 03:57:53.715250 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:53.715226 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e78af541-e68e-436e-8f47-28080dca3c2d-cert podName:e78af541-e68e-436e-8f47-28080dca3c2d nodeName:}" failed. No retries permitted until 2026-04-21 03:57:57.715217505 +0000 UTC m=+41.142989359 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e78af541-e68e-436e-8f47-28080dca3c2d-cert") pod "ingress-canary-lcnh4" (UID: "e78af541-e68e-436e-8f47-28080dca3c2d") : secret "canary-serving-cert" not found Apr 21 03:57:53.715250 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:53.715246 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6fd9ab38-f364-44df-8d61-4e3ba0946953-metrics-tls\") pod \"dns-default-879kk\" (UID: \"6fd9ab38-f364-44df-8d61-4e3ba0946953\") " pod="openshift-dns/dns-default-879kk" Apr 21 03:57:53.715586 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:53.715375 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 03:57:53.715586 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:53.715421 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fd9ab38-f364-44df-8d61-4e3ba0946953-metrics-tls podName:6fd9ab38-f364-44df-8d61-4e3ba0946953 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:57.715406905 +0000 UTC m=+41.143178762 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6fd9ab38-f364-44df-8d61-4e3ba0946953-metrics-tls") pod "dns-default-879kk" (UID: "6fd9ab38-f364-44df-8d61-4e3ba0946953") : secret "dns-default-metrics-tls" not found Apr 21 03:57:54.117657 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:54.117445 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw"] Apr 21 03:57:54.127867 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:54.127844 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9b95b8b9c-n4dlr"] Apr 21 03:57:54.131319 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:54.131296 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-m69zf"] Apr 21 03:57:54.132125 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:54.132104 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-58cc59f686-kn2l9"] Apr 21 03:57:54.275561 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:54.275486 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12a491ae_8363_4a48_8ded_779e5c0cb064.slice/crio-a87a5e3d84e02950350f892aafe79965cc3dd7298c5d8147490d1d77b34b47a8 WatchSource:0}: Error finding container a87a5e3d84e02950350f892aafe79965cc3dd7298c5d8147490d1d77b34b47a8: Status 404 returned error can't find the container with id a87a5e3d84e02950350f892aafe79965cc3dd7298c5d8147490d1d77b34b47a8 Apr 21 03:57:54.275894 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:54.275852 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4713f9e_6e12_483d_a2ee_70e9f784ce0b.slice/crio-03e34e3b73088e4dec91c0cd398d865e2004edf4e61e96ef28689df8973bb310 WatchSource:0}: Error finding container 03e34e3b73088e4dec91c0cd398d865e2004edf4e61e96ef28689df8973bb310: Status 404 returned error can't find the container with id 03e34e3b73088e4dec91c0cd398d865e2004edf4e61e96ef28689df8973bb310 Apr 21 03:57:54.276749 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:54.276727 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84abcc8f_012c_49d2_bd91_5947a2abcc8a.slice/crio-94c6dfdc4b9b6ff4de33f91e7af6ffefa0d4c8d4ab750db51b11675e4a3d1073 WatchSource:0}: Error finding container 94c6dfdc4b9b6ff4de33f91e7af6ffefa0d4c8d4ab750db51b11675e4a3d1073: Status 404 returned error can't find the container with id 94c6dfdc4b9b6ff4de33f91e7af6ffefa0d4c8d4ab750db51b11675e4a3d1073 Apr 21 03:57:54.277557 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:57:54.277539 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76716b30_cec0_4d8e_8c08_452dfeb18893.slice/crio-ceba8cb492d4ffae78b4ee90bb107d137d3f36d17b72ad24f8f34c867d333f6b WatchSource:0}: Error finding container ceba8cb492d4ffae78b4ee90bb107d137d3f36d17b72ad24f8f34c867d333f6b: Status 404 returned error can't find the container with id ceba8cb492d4ffae78b4ee90bb107d137d3f36d17b72ad24f8f34c867d333f6b Apr 21 03:57:54.340917 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:54.340892 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xddzt" 
event={"ID":"d9c4a2fd-4534-495a-8f40-6d8faf8f87e6","Type":"ContainerStarted","Data":"e7d5d793a38641e6acc6099c75cee8a2d93c39176d90df8702d98ecdec33ccf6"} Apr 21 03:57:54.341697 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:54.341676 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" event={"ID":"12a491ae-8363-4a48-8ded-779e5c0cb064","Type":"ContainerStarted","Data":"a87a5e3d84e02950350f892aafe79965cc3dd7298c5d8147490d1d77b34b47a8"} Apr 21 03:57:54.342497 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:54.342481 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9b95b8b9c-n4dlr" event={"ID":"d4713f9e-6e12-483d-a2ee-70e9f784ce0b","Type":"ContainerStarted","Data":"03e34e3b73088e4dec91c0cd398d865e2004edf4e61e96ef28689df8973bb310"} Apr 21 03:57:54.343294 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:54.343277 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58cc59f686-kn2l9" event={"ID":"84abcc8f-012c-49d2-bd91-5947a2abcc8a","Type":"ContainerStarted","Data":"94c6dfdc4b9b6ff4de33f91e7af6ffefa0d4c8d4ab750db51b11675e4a3d1073"} Apr 21 03:57:54.344153 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:54.344135 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-m69zf" event={"ID":"76716b30-cec0-4d8e-8c08-452dfeb18893","Type":"ContainerStarted","Data":"ceba8cb492d4ffae78b4ee90bb107d137d3f36d17b72ad24f8f34c867d333f6b"} Apr 21 03:57:54.378681 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:54.378477 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xddzt" podStartSLOduration=5.979907441 podStartE2EDuration="37.378458651s" podCreationTimestamp="2026-04-21 03:57:17 +0000 UTC" firstStartedPulling="2026-04-21 03:57:19.70318379 +0000 UTC m=+3.130955657" lastFinishedPulling="2026-04-21 03:57:51.101735015 +0000 UTC m=+34.529506867" observedRunningTime="2026-04-21 03:57:54.377208214 +0000 UTC m=+37.804980097" watchObservedRunningTime="2026-04-21 03:57:54.378458651 +0000 UTC m=+37.806230526" Apr 21 03:57:55.357835 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:55.357775 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ztjfh" event={"ID":"be04fd1e-83bf-49d7-8c60-4323b986ab81","Type":"ContainerStarted","Data":"56a4fd8bfa25f08f0697cffc9ff59449b1897705e5d98d338b56c5e169307fad"} Apr 21 03:57:55.358450 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:55.357843 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:57:55.375902 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:55.375158 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-ztjfh" podStartSLOduration=35.303646587 podStartE2EDuration="38.375138357s" podCreationTimestamp="2026-04-21 03:57:17 +0000 UTC" firstStartedPulling="2026-04-21 03:57:51.257558641 +0000 UTC m=+34.685330495" lastFinishedPulling="2026-04-21 03:57:54.329050412 +0000 UTC m=+37.756822265" observedRunningTime="2026-04-21 03:57:55.37428632 +0000 UTC m=+38.802058197" watchObservedRunningTime="2026-04-21 03:57:55.375138357 +0000 UTC m=+38.802910233" Apr 21 03:57:57.755484 ip-10-0-131-93 kubenswrapper[2570]: 
I0421 03:57:57.755440 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6fd9ab38-f364-44df-8d61-4e3ba0946953-metrics-tls\") pod \"dns-default-879kk\" (UID: \"6fd9ab38-f364-44df-8d61-4e3ba0946953\") " pod="openshift-dns/dns-default-879kk" Apr 21 03:57:57.756075 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:57.755554 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:57:57.756075 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:57.755599 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/97f709de-df2d-4000-add8-f6588eda3b15-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cctsl\" (UID: \"97f709de-df2d-4000-add8-f6588eda3b15\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cctsl" Apr 21 03:57:57.756075 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:57.755606 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 03:57:57.756075 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:57.755743 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 03:57:57.756075 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:57:57.755635 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e78af541-e68e-436e-8f47-28080dca3c2d-cert\") pod \"ingress-canary-lcnh4\" (UID: \"e78af541-e68e-436e-8f47-28080dca3c2d\") " pod="openshift-ingress-canary/ingress-canary-lcnh4" Apr 21 03:57:57.756075 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:57.755795 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e78af541-e68e-436e-8f47-28080dca3c2d-cert podName:e78af541-e68e-436e-8f47-28080dca3c2d nodeName:}" failed. No retries permitted until 2026-04-21 03:58:05.755777691 +0000 UTC m=+49.183549548 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e78af541-e68e-436e-8f47-28080dca3c2d-cert") pod "ingress-canary-lcnh4" (UID: "e78af541-e68e-436e-8f47-28080dca3c2d") : secret "canary-serving-cert" not found Apr 21 03:57:57.756075 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:57.755817 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fd9ab38-f364-44df-8d61-4e3ba0946953-metrics-tls podName:6fd9ab38-f364-44df-8d61-4e3ba0946953 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:05.755806919 +0000 UTC m=+49.183578772 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6fd9ab38-f364-44df-8d61-4e3ba0946953-metrics-tls") pod "dns-default-879kk" (UID: "6fd9ab38-f364-44df-8d61-4e3ba0946953") : secret "dns-default-metrics-tls" not found Apr 21 03:57:57.756075 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:57.755912 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 03:57:57.756075 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:57.755929 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f9667fbb5-8hs2h: secret "image-registry-tls" not found Apr 21 03:57:57.756075 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:57.755993 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls podName:3c847fe8-2248-42f1-9774-c95b562f9b61 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:05.755976506 +0000 UTC m=+49.183748360 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls") pod "image-registry-f9667fbb5-8hs2h" (UID: "3c847fe8-2248-42f1-9774-c95b562f9b61") : secret "image-registry-tls" not found Apr 21 03:57:57.756557 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:57.756103 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 03:57:57.756557 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:57:57.756161 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97f709de-df2d-4000-add8-f6588eda3b15-networking-console-plugin-cert podName:97f709de-df2d-4000-add8-f6588eda3b15 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:05.756140403 +0000 UTC m=+49.183912274 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/97f709de-df2d-4000-add8-f6588eda3b15-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-cctsl" (UID: "97f709de-df2d-4000-add8-f6588eda3b15") : secret "networking-console-plugin-cert" not found Apr 21 03:58:03.375021 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:58:03.374955 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9b95b8b9c-n4dlr" event={"ID":"d4713f9e-6e12-483d-a2ee-70e9f784ce0b","Type":"ContainerStarted","Data":"50df1184faa547467c70caa5a0a743a76387827444ca20fb79b4cf59fbaad025"} Apr 21 03:58:03.376478 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:58:03.376440 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58cc59f686-kn2l9" event={"ID":"84abcc8f-012c-49d2-bd91-5947a2abcc8a","Type":"ContainerStarted","Data":"62600b73569811b21cdc2f18998f12c1e6988c69fdabfbe380091ef850337ef5"} Apr 21 03:58:03.376671 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:58:03.376640 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58cc59f686-kn2l9" Apr 21 03:58:03.378098 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:58:03.377941 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-m69zf" event={"ID":"76716b30-cec0-4d8e-8c08-452dfeb18893","Type":"ContainerStarted","Data":"150fb4b0002be4b588648321e8a717f814ec99c30f50697abb816d403d88592f"} Apr 21 03:58:03.378763 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:58:03.378742 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58cc59f686-kn2l9" Apr 21 03:58:03.379569 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:58:03.379548 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" event={"ID":"12a491ae-8363-4a48-8ded-779e5c0cb064","Type":"ContainerStarted","Data":"f9e17f504cdb03e8b8bf01b6a01bb7f0c8adeceda9a80e4eed8eb6a7ec811649"} Apr 21 03:58:03.389945 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:58:03.389889 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9b95b8b9c-n4dlr" podStartSLOduration=3.360183181 podStartE2EDuration="11.389878252s" podCreationTimestamp="2026-04-21 03:57:52 +0000 UTC" firstStartedPulling="2026-04-21 03:57:54.316433999 +0000 UTC m=+37.744205858" lastFinishedPulling="2026-04-21 03:58:02.346129062 +0000 UTC m=+45.773900929" observedRunningTime="2026-04-21 03:58:03.389113297 +0000 UTC m=+46.816885172" watchObservedRunningTime="2026-04-21 03:58:03.389878252 +0000 UTC m=+46.817650121" Apr 21 03:58:03.401682 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:58:03.401640 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-m69zf" podStartSLOduration=18.360834938 podStartE2EDuration="26.401624466s" podCreationTimestamp="2026-04-21 03:57:37 +0000 UTC" firstStartedPulling="2026-04-21 03:57:54.31666973 +0000 UTC m=+37.744441589" lastFinishedPulling="2026-04-21 03:58:02.357459265 +0000 UTC m=+45.785231117" observedRunningTime="2026-04-21 03:58:03.401470612 +0000 UTC m=+46.829242487" watchObservedRunningTime="2026-04-21 
03:58:03.401624466 +0000 UTC m=+46.829396341" Apr 21 03:58:03.416184 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:58:03.416149 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58cc59f686-kn2l9" podStartSLOduration=2.386289221 podStartE2EDuration="10.416136513s" podCreationTimestamp="2026-04-21 03:57:53 +0000 UTC" firstStartedPulling="2026-04-21 03:57:54.316286339 +0000 UTC m=+37.744058207" lastFinishedPulling="2026-04-21 03:58:02.346133632 +0000 UTC m=+45.773905499" observedRunningTime="2026-04-21 03:58:03.415714428 +0000 UTC m=+46.843486304" watchObservedRunningTime="2026-04-21 03:58:03.416136513 +0000 UTC m=+46.843908389" Apr 21 03:58:05.387326 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:58:05.387289 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" event={"ID":"12a491ae-8363-4a48-8ded-779e5c0cb064","Type":"ContainerStarted","Data":"56add889d9b39bb1d38e758f714eddd3997341dbdc686625e80593e607d25bed"} Apr 21 03:58:05.387326 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:58:05.387330 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" event={"ID":"12a491ae-8363-4a48-8ded-779e5c0cb064","Type":"ContainerStarted","Data":"3fd1efc8b94a249d230909eba6bfc90ddbdfb89bb32f47a716a793634744be64"} Apr 21 03:58:05.408401 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:58:05.408359 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" podStartSLOduration=2.869196358 podStartE2EDuration="13.408348427s" podCreationTimestamp="2026-04-21 03:57:52 +0000 UTC" firstStartedPulling="2026-04-21 03:57:54.316342634 +0000 UTC m=+37.744114501" lastFinishedPulling="2026-04-21 03:58:04.855494717 +0000 UTC m=+48.283266570" observedRunningTime="2026-04-21 03:58:05.407227554 +0000 UTC m=+48.834999433" watchObservedRunningTime="2026-04-21 03:58:05.408348427 +0000 UTC m=+48.836120307" Apr 21 03:58:05.825413 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:58:05.825322 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:58:05.825413 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:58:05.825376 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/97f709de-df2d-4000-add8-f6588eda3b15-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cctsl\" (UID: \"97f709de-df2d-4000-add8-f6588eda3b15\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cctsl" Apr 21 03:58:05.825413 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:58:05.825398 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e78af541-e68e-436e-8f47-28080dca3c2d-cert\") pod \"ingress-canary-lcnh4\" (UID: \"e78af541-e68e-436e-8f47-28080dca3c2d\") " pod="openshift-ingress-canary/ingress-canary-lcnh4" Apr 21 03:58:05.825615 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:58:05.825421 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6fd9ab38-f364-44df-8d61-4e3ba0946953-metrics-tls\") pod \"dns-default-879kk\" (UID: \"6fd9ab38-f364-44df-8d61-4e3ba0946953\") " pod="openshift-dns/dns-default-879kk" Apr 21 03:58:05.825615 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:05.825472 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 03:58:05.825615 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:05.825499 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f9667fbb5-8hs2h: secret "image-registry-tls" not found Apr 21 03:58:05.825615 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:05.825521 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 03:58:05.825615 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:05.825527 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 03:58:05.825615 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:05.825568 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 03:58:05.825615 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:05.825562 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls podName:3c847fe8-2248-42f1-9774-c95b562f9b61 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:21.825541718 +0000 UTC m=+65.253313590 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls") pod "image-registry-f9667fbb5-8hs2h" (UID: "3c847fe8-2248-42f1-9774-c95b562f9b61") : secret "image-registry-tls" not found Apr 21 03:58:05.825615 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:05.825597 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97f709de-df2d-4000-add8-f6588eda3b15-networking-console-plugin-cert podName:97f709de-df2d-4000-add8-f6588eda3b15 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:21.82558715 +0000 UTC m=+65.253359003 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/97f709de-df2d-4000-add8-f6588eda3b15-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-cctsl" (UID: "97f709de-df2d-4000-add8-f6588eda3b15") : secret "networking-console-plugin-cert" not found Apr 21 03:58:05.825615 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:05.825610 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e78af541-e68e-436e-8f47-28080dca3c2d-cert podName:e78af541-e68e-436e-8f47-28080dca3c2d nodeName:}" failed. No retries permitted until 2026-04-21 03:58:21.825603363 +0000 UTC m=+65.253375215 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e78af541-e68e-436e-8f47-28080dca3c2d-cert") pod "ingress-canary-lcnh4" (UID: "e78af541-e68e-436e-8f47-28080dca3c2d") : secret "canary-serving-cert" not found Apr 21 03:58:05.825894 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:05.825631 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fd9ab38-f364-44df-8d61-4e3ba0946953-metrics-tls podName:6fd9ab38-f364-44df-8d61-4e3ba0946953 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:21.825623321 +0000 UTC m=+65.253395174 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6fd9ab38-f364-44df-8d61-4e3ba0946953-metrics-tls") pod "dns-default-879kk" (UID: "6fd9ab38-f364-44df-8d61-4e3ba0946953") : secret "dns-default-metrics-tls" not found Apr 21 03:58:16.325601 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:58:16.325574 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6p6fc" Apr 21 03:58:21.841841 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:58:21.841807 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:58:21.841841 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:58:21.841853 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/97f709de-df2d-4000-add8-f6588eda3b15-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cctsl\" (UID: \"97f709de-df2d-4000-add8-f6588eda3b15\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cctsl" Apr 21 03:58:21.842399 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:58:21.841879 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e78af541-e68e-436e-8f47-28080dca3c2d-cert\") pod \"ingress-canary-lcnh4\" (UID: \"e78af541-e68e-436e-8f47-28080dca3c2d\") " pod="openshift-ingress-canary/ingress-canary-lcnh4" Apr 21 03:58:21.842399 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:58:21.841906 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6fd9ab38-f364-44df-8d61-4e3ba0946953-metrics-tls\") pod \"dns-default-879kk\" (UID: \"6fd9ab38-f364-44df-8d61-4e3ba0946953\") " pod="openshift-dns/dns-default-879kk" Apr 21 03:58:21.842399 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:21.841966 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 03:58:21.842399 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:21.841987 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f9667fbb5-8hs2h: secret "image-registry-tls" not found Apr 21 03:58:21.842399 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:21.841993 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 03:58:21.842399 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:21.841971 2570 secret.go:189] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 03:58:21.842399 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:21.842023 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 03:58:21.842399 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:21.842051 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fd9ab38-f364-44df-8d61-4e3ba0946953-metrics-tls podName:6fd9ab38-f364-44df-8d61-4e3ba0946953 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:53.842023193 +0000 UTC m=+97.269795046 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6fd9ab38-f364-44df-8d61-4e3ba0946953-metrics-tls") pod "dns-default-879kk" (UID: "6fd9ab38-f364-44df-8d61-4e3ba0946953") : secret "dns-default-metrics-tls" not found Apr 21 03:58:21.842399 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:21.842126 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97f709de-df2d-4000-add8-f6588eda3b15-networking-console-plugin-cert podName:97f709de-df2d-4000-add8-f6588eda3b15 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:53.842106905 +0000 UTC m=+97.269878759 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/97f709de-df2d-4000-add8-f6588eda3b15-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-cctsl" (UID: "97f709de-df2d-4000-add8-f6588eda3b15") : secret "networking-console-plugin-cert" not found Apr 21 03:58:21.842399 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:21.842143 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls podName:3c847fe8-2248-42f1-9774-c95b562f9b61 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:53.842132215 +0000 UTC m=+97.269904076 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls") pod "image-registry-f9667fbb5-8hs2h" (UID: "3c847fe8-2248-42f1-9774-c95b562f9b61") : secret "image-registry-tls" not found Apr 21 03:58:21.842399 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:21.842167 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e78af541-e68e-436e-8f47-28080dca3c2d-cert podName:e78af541-e68e-436e-8f47-28080dca3c2d nodeName:}" failed. No retries permitted until 2026-04-21 03:58:53.84215644 +0000 UTC m=+97.269928299 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e78af541-e68e-436e-8f47-28080dca3c2d-cert") pod "ingress-canary-lcnh4" (UID: "e78af541-e68e-436e-8f47-28080dca3c2d") : secret "canary-serving-cert" not found Apr 21 03:58:22.848891 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:58:22.848847 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs\") pod \"network-metrics-daemon-lf2dl\" (UID: \"0642b1aa-ff76-4694-bad0-be2656b81005\") " pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:58:22.849338 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:22.849023 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 03:58:22.849338 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:22.849106 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs podName:0642b1aa-ff76-4694-bad0-be2656b81005 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:26.849089689 +0000 UTC m=+130.276861542 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs") pod "network-metrics-daemon-lf2dl" (UID: "0642b1aa-ff76-4694-bad0-be2656b81005") : secret "metrics-daemon-secret" not found Apr 21 03:58:27.364772 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:58:27.364735 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-ztjfh" Apr 21 03:58:53.890166 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:58:53.890133 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:58:53.890166 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:58:53.890175 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/97f709de-df2d-4000-add8-f6588eda3b15-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cctsl\" (UID: \"97f709de-df2d-4000-add8-f6588eda3b15\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cctsl" Apr 21 03:58:53.890701 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:58:53.890199 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e78af541-e68e-436e-8f47-28080dca3c2d-cert\") pod \"ingress-canary-lcnh4\" (UID: \"e78af541-e68e-436e-8f47-28080dca3c2d\") " pod="openshift-ingress-canary/ingress-canary-lcnh4" Apr 21 03:58:53.890701 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:53.890279 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 03:58:53.890701 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:53.890281 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 03:58:53.890701 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:53.890297 2570 secret.go:189] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 03:58:53.890701 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:58:53.890321 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6fd9ab38-f364-44df-8d61-4e3ba0946953-metrics-tls\") pod \"dns-default-879kk\" (UID: \"6fd9ab38-f364-44df-8d61-4e3ba0946953\") " pod="openshift-dns/dns-default-879kk" Apr 21 03:58:53.890701 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:53.890331 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e78af541-e68e-436e-8f47-28080dca3c2d-cert podName:e78af541-e68e-436e-8f47-28080dca3c2d nodeName:}" failed. No retries permitted until 2026-04-21 03:59:57.890318955 +0000 UTC m=+161.318090807 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e78af541-e68e-436e-8f47-28080dca3c2d-cert") pod "ingress-canary-lcnh4" (UID: "e78af541-e68e-436e-8f47-28080dca3c2d") : secret "canary-serving-cert" not found Apr 21 03:58:53.890701 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:53.890302 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f9667fbb5-8hs2h: secret "image-registry-tls" not found Apr 21 03:58:53.890701 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:53.890363 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97f709de-df2d-4000-add8-f6588eda3b15-networking-console-plugin-cert podName:97f709de-df2d-4000-add8-f6588eda3b15 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:57.890348612 +0000 UTC m=+161.318120464 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/97f709de-df2d-4000-add8-f6588eda3b15-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-cctsl" (UID: "97f709de-df2d-4000-add8-f6588eda3b15") : secret "networking-console-plugin-cert" not found Apr 21 03:58:53.890701 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:53.890415 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 03:58:53.890701 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:53.890420 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls podName:3c847fe8-2248-42f1-9774-c95b562f9b61 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:57.890403942 +0000 UTC m=+161.318175812 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls") pod "image-registry-f9667fbb5-8hs2h" (UID: "3c847fe8-2248-42f1-9774-c95b562f9b61") : secret "image-registry-tls" not found Apr 21 03:58:53.890701 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:58:53.890460 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fd9ab38-f364-44df-8d61-4e3ba0946953-metrics-tls podName:6fd9ab38-f364-44df-8d61-4e3ba0946953 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:57.89044592 +0000 UTC m=+161.318217787 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6fd9ab38-f364-44df-8d61-4e3ba0946953-metrics-tls") pod "dns-default-879kk" (UID: "6fd9ab38-f364-44df-8d61-4e3ba0946953") : secret "dns-default-metrics-tls" not found Apr 21 03:59:26.939626 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:26.939587 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs\") pod \"network-metrics-daemon-lf2dl\" (UID: \"0642b1aa-ff76-4694-bad0-be2656b81005\") " pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 03:59:26.940131 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:59:26.939738 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 03:59:26.940131 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:59:26.939805 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs podName:0642b1aa-ff76-4694-bad0-be2656b81005 nodeName:}" failed. No retries permitted until 2026-04-21 04:01:28.939789788 +0000 UTC m=+252.367561644 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs") pod "network-metrics-daemon-lf2dl" (UID: "0642b1aa-ff76-4694-bad0-be2656b81005") : secret "metrics-daemon-secret" not found Apr 21 03:59:28.519601 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:28.519575 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-26mzt_8a8c6dad-8135-4d60-b437-56307544e064/dns-node-resolver/0.log" Apr 21 03:59:29.719421 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:29.719396 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jw8wp_2b3ca347-33c6-4fbc-bc3d-a93f9f0c27fa/node-ca/0.log" Apr 21 03:59:52.959389 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:59:52.959338 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-cctsl" podUID="97f709de-df2d-4000-add8-f6588eda3b15" Apr 21 03:59:52.966542 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:59:52.966515 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" podUID="3c847fe8-2248-42f1-9774-c95b562f9b61" Apr 21 03:59:52.981672 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:59:52.981637 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-879kk" podUID="6fd9ab38-f364-44df-8d61-4e3ba0946953" Apr 21 03:59:52.987777 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:59:52.987755 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-lcnh4" podUID="e78af541-e68e-436e-8f47-28080dca3c2d" Apr 21 03:59:53.178013 ip-10-0-131-93 kubenswrapper[2570]: E0421 03:59:53.177963 2570 
pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-lf2dl" podUID="0642b1aa-ff76-4694-bad0-be2656b81005" Apr 21 03:59:53.641341 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:53.641312 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:59:53.641507 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:53.641312 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lcnh4" Apr 21 03:59:53.641507 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:53.641312 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-cctsl" Apr 21 03:59:53.641587 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:53.641325 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-879kk" Apr 21 03:59:57.982189 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:57.982152 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e78af541-e68e-436e-8f47-28080dca3c2d-cert\") pod \"ingress-canary-lcnh4\" (UID: \"e78af541-e68e-436e-8f47-28080dca3c2d\") " pod="openshift-ingress-canary/ingress-canary-lcnh4" Apr 21 03:59:57.982653 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:57.982200 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6fd9ab38-f364-44df-8d61-4e3ba0946953-metrics-tls\") pod \"dns-default-879kk\" (UID: \"6fd9ab38-f364-44df-8d61-4e3ba0946953\") " pod="openshift-dns/dns-default-879kk" Apr 21 03:59:57.982653 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:57.982456 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:59:57.982653 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:57.982523 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/97f709de-df2d-4000-add8-f6588eda3b15-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cctsl\" (UID: \"97f709de-df2d-4000-add8-f6588eda3b15\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cctsl" Apr 21 03:59:57.985321 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:57.985293 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls\") pod \"image-registry-f9667fbb5-8hs2h\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:59:57.985425 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:57.985295 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6fd9ab38-f364-44df-8d61-4e3ba0946953-metrics-tls\") pod \"dns-default-879kk\" (UID: \"6fd9ab38-f364-44df-8d61-4e3ba0946953\") " 
pod="openshift-dns/dns-default-879kk" Apr 21 03:59:57.985425 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:57.985331 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e78af541-e68e-436e-8f47-28080dca3c2d-cert\") pod \"ingress-canary-lcnh4\" (UID: \"e78af541-e68e-436e-8f47-28080dca3c2d\") " pod="openshift-ingress-canary/ingress-canary-lcnh4" Apr 21 03:59:57.985425 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:57.985411 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/97f709de-df2d-4000-add8-f6588eda3b15-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-cctsl\" (UID: \"97f709de-df2d-4000-add8-f6588eda3b15\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-cctsl" Apr 21 03:59:58.145543 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:58.145509 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-7mknc\"" Apr 21 03:59:58.145543 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:58.145534 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-hvdwx\"" Apr 21 03:59:58.145830 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:58.145534 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7tx7c\"" Apr 21 03:59:58.145830 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:58.145540 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fzhgd\"" Apr 21 03:59:58.152705 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:58.152689 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-cctsl" Apr 21 03:59:58.152750 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:58.152725 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:59:58.152825 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:58.152696 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lcnh4" Apr 21 03:59:58.152895 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:58.152882 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-879kk" Apr 21 03:59:58.330675 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:58.330648 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-879kk"] Apr 21 03:59:58.334473 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:59:58.334436 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fd9ab38_f364_44df_8d61_4e3ba0946953.slice/crio-f2f215fa2bd46297530b17d45f0a10ff8464d888d9c236c6d4bf2948b78fe4a6 WatchSource:0}: Error finding container f2f215fa2bd46297530b17d45f0a10ff8464d888d9c236c6d4bf2948b78fe4a6: Status 404 returned error can't find the container with id f2f215fa2bd46297530b17d45f0a10ff8464d888d9c236c6d4bf2948b78fe4a6 Apr 21 03:59:58.350898 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:58.350873 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-f9667fbb5-8hs2h"] Apr 21 03:59:58.354247 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:59:58.354221 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c847fe8_2248_42f1_9774_c95b562f9b61.slice/crio-c109980ba61aea2346c0befd70a59bf911c7eb6fd021da08a27706173d8b4d06 WatchSource:0}: Error finding container c109980ba61aea2346c0befd70a59bf911c7eb6fd021da08a27706173d8b4d06: Status 404 returned error can't find the container with id c109980ba61aea2346c0befd70a59bf911c7eb6fd021da08a27706173d8b4d06 Apr 21 03:59:58.569642 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:58.569571 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lcnh4"] Apr 21 03:59:58.572279 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:58.572255 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-cctsl"] Apr 21 03:59:58.572928 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:59:58.572906 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode78af541_e68e_436e_8f47_28080dca3c2d.slice/crio-a8be510ffc6e832ba8b70a8729650ad1f99115836c51b5caa6c8b9608ad1e513 WatchSource:0}: Error finding container a8be510ffc6e832ba8b70a8729650ad1f99115836c51b5caa6c8b9608ad1e513: Status 404 returned error can't find the container with id a8be510ffc6e832ba8b70a8729650ad1f99115836c51b5caa6c8b9608ad1e513 Apr 21 03:59:58.575121 ip-10-0-131-93 kubenswrapper[2570]: W0421 03:59:58.575083 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97f709de_df2d_4000_add8_f6588eda3b15.slice/crio-21aeff3a8b5fabc126fce9dd1c98d258916699d8247901bd080266db54be5070 WatchSource:0}: Error finding container 21aeff3a8b5fabc126fce9dd1c98d258916699d8247901bd080266db54be5070: Status 404 returned error can't find the container with id 21aeff3a8b5fabc126fce9dd1c98d258916699d8247901bd080266db54be5070 Apr 21 03:59:58.653150 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:58.653116 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" event={"ID":"3c847fe8-2248-42f1-9774-c95b562f9b61","Type":"ContainerStarted","Data":"025c20a41f06b2bec5c044e2926e854d12f3e22f6aa65d9a07d1d6859ac593fb"} Apr 21 03:59:58.653150 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:58.653155 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" event={"ID":"3c847fe8-2248-42f1-9774-c95b562f9b61","Type":"ContainerStarted","Data":"c109980ba61aea2346c0befd70a59bf911c7eb6fd021da08a27706173d8b4d06"} Apr 21 03:59:58.653376 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:58.653256 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 03:59:58.654277 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:58.654249 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-cctsl" event={"ID":"97f709de-df2d-4000-add8-f6588eda3b15","Type":"ContainerStarted","Data":"21aeff3a8b5fabc126fce9dd1c98d258916699d8247901bd080266db54be5070"} Apr 21 03:59:58.655338 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:58.655306 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lcnh4" event={"ID":"e78af541-e68e-436e-8f47-28080dca3c2d","Type":"ContainerStarted","Data":"a8be510ffc6e832ba8b70a8729650ad1f99115836c51b5caa6c8b9608ad1e513"} Apr 21 03:59:58.656250 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:58.656226 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-879kk" event={"ID":"6fd9ab38-f364-44df-8d61-4e3ba0946953","Type":"ContainerStarted","Data":"f2f215fa2bd46297530b17d45f0a10ff8464d888d9c236c6d4bf2948b78fe4a6"} Apr 21 03:59:58.673980 ip-10-0-131-93 kubenswrapper[2570]: I0421 03:59:58.673937 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" podStartSLOduration=161.673922418 podStartE2EDuration="2m41.673922418s" podCreationTimestamp="2026-04-21 03:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 03:59:58.673180094 +0000 UTC m=+162.100951973" watchObservedRunningTime="2026-04-21 03:59:58.673922418 +0000 UTC m=+162.101694283" Apr 21 04:00:00.235125 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.235093 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-dhkjb"] Apr 21 04:00:00.238189 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.238168 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dhkjb" Apr 21 04:00:00.241420 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.241396 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 04:00:00.241546 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.241396 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 04:00:00.241730 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.241717 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-tgq7l\"" Apr 21 04:00:00.241950 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.241934 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 04:00:00.242063 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.242048 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 04:00:00.255131 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.255106 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dhkjb"] Apr 21 04:00:00.303628 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.303607 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a6921503-4d28-40e8-ad65-db03edc30976-crio-socket\") pod \"insights-runtime-extractor-dhkjb\" (UID: \"a6921503-4d28-40e8-ad65-db03edc30976\") " pod="openshift-insights/insights-runtime-extractor-dhkjb" Apr 21 04:00:00.303734 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.303640 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a6921503-4d28-40e8-ad65-db03edc30976-data-volume\") pod \"insights-runtime-extractor-dhkjb\" (UID: \"a6921503-4d28-40e8-ad65-db03edc30976\") " pod="openshift-insights/insights-runtime-extractor-dhkjb" Apr 21 04:00:00.303734 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.303672 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a6921503-4d28-40e8-ad65-db03edc30976-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dhkjb\" (UID: \"a6921503-4d28-40e8-ad65-db03edc30976\") " pod="openshift-insights/insights-runtime-extractor-dhkjb" Apr 21 04:00:00.303734 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.303698 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a6921503-4d28-40e8-ad65-db03edc30976-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dhkjb\" (UID: \"a6921503-4d28-40e8-ad65-db03edc30976\") " pod="openshift-insights/insights-runtime-extractor-dhkjb" Apr 21 04:00:00.303734 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.303726 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx4nt\" (UniqueName: \"kubernetes.io/projected/a6921503-4d28-40e8-ad65-db03edc30976-kube-api-access-fx4nt\") pod \"insights-runtime-extractor-dhkjb\" (UID: \"a6921503-4d28-40e8-ad65-db03edc30976\") " 
pod="openshift-insights/insights-runtime-extractor-dhkjb" Apr 21 04:00:00.404306 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.404266 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a6921503-4d28-40e8-ad65-db03edc30976-crio-socket\") pod \"insights-runtime-extractor-dhkjb\" (UID: \"a6921503-4d28-40e8-ad65-db03edc30976\") " pod="openshift-insights/insights-runtime-extractor-dhkjb" Apr 21 04:00:00.404306 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.404296 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a6921503-4d28-40e8-ad65-db03edc30976-data-volume\") pod \"insights-runtime-extractor-dhkjb\" (UID: \"a6921503-4d28-40e8-ad65-db03edc30976\") " pod="openshift-insights/insights-runtime-extractor-dhkjb" Apr 21 04:00:00.404520 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.404324 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a6921503-4d28-40e8-ad65-db03edc30976-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dhkjb\" (UID: \"a6921503-4d28-40e8-ad65-db03edc30976\") " pod="openshift-insights/insights-runtime-extractor-dhkjb" Apr 21 04:00:00.404520 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.404350 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a6921503-4d28-40e8-ad65-db03edc30976-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dhkjb\" (UID: \"a6921503-4d28-40e8-ad65-db03edc30976\") " pod="openshift-insights/insights-runtime-extractor-dhkjb" Apr 21 04:00:00.404520 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.404373 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fx4nt\" (UniqueName: \"kubernetes.io/projected/a6921503-4d28-40e8-ad65-db03edc30976-kube-api-access-fx4nt\") pod \"insights-runtime-extractor-dhkjb\" (UID: \"a6921503-4d28-40e8-ad65-db03edc30976\") " pod="openshift-insights/insights-runtime-extractor-dhkjb" Apr 21 04:00:00.404520 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.404383 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a6921503-4d28-40e8-ad65-db03edc30976-crio-socket\") pod \"insights-runtime-extractor-dhkjb\" (UID: \"a6921503-4d28-40e8-ad65-db03edc30976\") " pod="openshift-insights/insights-runtime-extractor-dhkjb" Apr 21 04:00:00.404666 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.404542 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a6921503-4d28-40e8-ad65-db03edc30976-data-volume\") pod \"insights-runtime-extractor-dhkjb\" (UID: \"a6921503-4d28-40e8-ad65-db03edc30976\") " pod="openshift-insights/insights-runtime-extractor-dhkjb" Apr 21 04:00:00.404906 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.404890 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a6921503-4d28-40e8-ad65-db03edc30976-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dhkjb\" (UID: \"a6921503-4d28-40e8-ad65-db03edc30976\") " pod="openshift-insights/insights-runtime-extractor-dhkjb" Apr 21 04:00:00.406644 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.406626 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a6921503-4d28-40e8-ad65-db03edc30976-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dhkjb\" (UID: \"a6921503-4d28-40e8-ad65-db03edc30976\") " pod="openshift-insights/insights-runtime-extractor-dhkjb" Apr 21 04:00:00.411912 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.411890 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx4nt\" (UniqueName: \"kubernetes.io/projected/a6921503-4d28-40e8-ad65-db03edc30976-kube-api-access-fx4nt\") pod \"insights-runtime-extractor-dhkjb\" (UID: \"a6921503-4d28-40e8-ad65-db03edc30976\") " pod="openshift-insights/insights-runtime-extractor-dhkjb" Apr 21 04:00:00.546893 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.546854 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dhkjb" Apr 21 04:00:00.664359 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.664311 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-879kk" event={"ID":"6fd9ab38-f364-44df-8d61-4e3ba0946953","Type":"ContainerStarted","Data":"4e959890d1bee3b977f513581773467bf960baee7be44c3bb01a1bc0e6950fb0"} Apr 21 04:00:00.664359 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.664355 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-879kk" event={"ID":"6fd9ab38-f364-44df-8d61-4e3ba0946953","Type":"ContainerStarted","Data":"0177e81559fec4974e38a88d5f65434a96355e5c593fd89bd0f2a0109113ec30"} Apr 21 04:00:00.664597 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.664416 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-879kk" Apr 21 04:00:00.665609 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.665565 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-cctsl" event={"ID":"97f709de-df2d-4000-add8-f6588eda3b15","Type":"ContainerStarted","Data":"c0d27ba23c63a72546f2209e38007d3f265e409e226703425bf060f25664b878"} Apr 21 04:00:00.666757 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.666740 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lcnh4" event={"ID":"e78af541-e68e-436e-8f47-28080dca3c2d","Type":"ContainerStarted","Data":"a8afa8a14d7aede845ec06313f9682ef41fc57e1f99da80bb33ebb7a4fe0aa49"} Apr 21 04:00:00.673879 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.673839 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dhkjb"] Apr 21 04:00:00.677313 ip-10-0-131-93 kubenswrapper[2570]: W0421 04:00:00.677294 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6921503_4d28_40e8_ad65_db03edc30976.slice/crio-86210d9913d8744925ebdc798e55bc65d50c8fff92378b5e44f2e0a53707cef3 WatchSource:0}: Error finding container 86210d9913d8744925ebdc798e55bc65d50c8fff92378b5e44f2e0a53707cef3: Status 404 returned error can't find the container with id 86210d9913d8744925ebdc798e55bc65d50c8fff92378b5e44f2e0a53707cef3 Apr 21 04:00:00.688967 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.688926 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-879kk" podStartSLOduration=129.727563831 
podStartE2EDuration="2m11.688913237s" podCreationTimestamp="2026-04-21 03:57:49 +0000 UTC" firstStartedPulling="2026-04-21 03:59:58.336543049 +0000 UTC m=+161.764314902" lastFinishedPulling="2026-04-21 04:00:00.297892439 +0000 UTC m=+163.725664308" observedRunningTime="2026-04-21 04:00:00.688214983 +0000 UTC m=+164.115986856" watchObservedRunningTime="2026-04-21 04:00:00.688913237 +0000 UTC m=+164.116685127" Apr 21 04:00:00.706080 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.702836 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-cctsl" podStartSLOduration=160.982742093 podStartE2EDuration="2m42.702617436s" podCreationTimestamp="2026-04-21 03:57:18 +0000 UTC" firstStartedPulling="2026-04-21 03:59:58.577019251 +0000 UTC m=+162.004791118" lastFinishedPulling="2026-04-21 04:00:00.296894601 +0000 UTC m=+163.724666461" observedRunningTime="2026-04-21 04:00:00.70188732 +0000 UTC m=+164.129659196" watchObservedRunningTime="2026-04-21 04:00:00.702617436 +0000 UTC m=+164.130389302" Apr 21 04:00:00.717378 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:00.717326 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-lcnh4" podStartSLOduration=129.991092533 podStartE2EDuration="2m11.717306911s" podCreationTimestamp="2026-04-21 03:57:49 +0000 UTC" firstStartedPulling="2026-04-21 03:59:58.575535535 +0000 UTC m=+162.003307394" lastFinishedPulling="2026-04-21 04:00:00.301749914 +0000 UTC m=+163.729521772" observedRunningTime="2026-04-21 04:00:00.716232858 +0000 UTC m=+164.144004734" watchObservedRunningTime="2026-04-21 04:00:00.717306911 +0000 UTC m=+164.145078787" Apr 21 04:00:01.670722 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:01.670685 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dhkjb" event={"ID":"a6921503-4d28-40e8-ad65-db03edc30976","Type":"ContainerStarted","Data":"469ebf2f24cb784aec6b7c0bd1c94a7b2308dc079be1af69f1e067d5ed7abde4"} Apr 21 04:00:01.670722 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:01.670724 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dhkjb" event={"ID":"a6921503-4d28-40e8-ad65-db03edc30976","Type":"ContainerStarted","Data":"4ea9b83afff41c93061d1f61c6f28a8e721bd6dfe2a21cb6c0249fdfd6f53595"} Apr 21 04:00:01.671161 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:01.670733 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dhkjb" event={"ID":"a6921503-4d28-40e8-ad65-db03edc30976","Type":"ContainerStarted","Data":"86210d9913d8744925ebdc798e55bc65d50c8fff92378b5e44f2e0a53707cef3"} Apr 21 04:00:02.675643 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:02.675607 2570 generic.go:358] "Generic (PLEG): container finished" podID="84abcc8f-012c-49d2-bd91-5947a2abcc8a" containerID="62600b73569811b21cdc2f18998f12c1e6988c69fdabfbe380091ef850337ef5" exitCode=1 Apr 21 04:00:02.676118 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:02.675689 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58cc59f686-kn2l9" event={"ID":"84abcc8f-012c-49d2-bd91-5947a2abcc8a","Type":"ContainerDied","Data":"62600b73569811b21cdc2f18998f12c1e6988c69fdabfbe380091ef850337ef5"} Apr 21 04:00:02.676174 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:02.676118 2570 scope.go:117] "RemoveContainer" 
containerID="62600b73569811b21cdc2f18998f12c1e6988c69fdabfbe380091ef850337ef5" Apr 21 04:00:02.677313 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:02.677291 2570 generic.go:358] "Generic (PLEG): container finished" podID="d4713f9e-6e12-483d-a2ee-70e9f784ce0b" containerID="50df1184faa547467c70caa5a0a743a76387827444ca20fb79b4cf59fbaad025" exitCode=255 Apr 21 04:00:02.677418 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:02.677338 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9b95b8b9c-n4dlr" event={"ID":"d4713f9e-6e12-483d-a2ee-70e9f784ce0b","Type":"ContainerDied","Data":"50df1184faa547467c70caa5a0a743a76387827444ca20fb79b4cf59fbaad025"} Apr 21 04:00:02.677630 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:02.677614 2570 scope.go:117] "RemoveContainer" containerID="50df1184faa547467c70caa5a0a743a76387827444ca20fb79b4cf59fbaad025" Apr 21 04:00:03.333546 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:03.333472 2570 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9b95b8b9c-n4dlr" Apr 21 04:00:03.363501 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:03.363473 2570 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58cc59f686-kn2l9" Apr 21 04:00:03.376608 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:03.376590 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58cc59f686-kn2l9" Apr 21 04:00:03.681303 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:03.681270 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9b95b8b9c-n4dlr" event={"ID":"d4713f9e-6e12-483d-a2ee-70e9f784ce0b","Type":"ContainerStarted","Data":"fc02c710eea9368da406c4f09ac4e99d17ce70378a5c977bffeaae0e8b7f9b41"} Apr 21 04:00:03.682858 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:03.682836 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dhkjb" event={"ID":"a6921503-4d28-40e8-ad65-db03edc30976","Type":"ContainerStarted","Data":"fd6b24d119b6b439ae3337fda1727ada8be636570d3124cdf4d67a3de344d9d1"} Apr 21 04:00:03.684269 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:03.684249 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58cc59f686-kn2l9" event={"ID":"84abcc8f-012c-49d2-bd91-5947a2abcc8a","Type":"ContainerStarted","Data":"42ca379917d3f1ddd675cae4032907911c93f20e2e6a3530eb94c8195ba0dd59"} Apr 21 04:00:03.684452 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:03.684437 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58cc59f686-kn2l9" Apr 21 04:00:03.685734 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:03.685716 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58cc59f686-kn2l9" Apr 21 04:00:03.718848 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:03.718805 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-dhkjb" podStartSLOduration=1.379888357 podStartE2EDuration="3.718793629s" 
podCreationTimestamp="2026-04-21 04:00:00 +0000 UTC" firstStartedPulling="2026-04-21 04:00:00.73426352 +0000 UTC m=+164.162035374" lastFinishedPulling="2026-04-21 04:00:03.07316879 +0000 UTC m=+166.500940646" observedRunningTime="2026-04-21 04:00:03.71805523 +0000 UTC m=+167.145827101" watchObservedRunningTime="2026-04-21 04:00:03.718793629 +0000 UTC m=+167.146565504" Apr 21 04:00:08.056161 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.056080 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-6xcz4"] Apr 21 04:00:08.059099 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.059080 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:08.061346 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.061318 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 04:00:08.061506 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.061358 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-d559f\"" Apr 21 04:00:08.061506 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.061399 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 04:00:08.061506 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.061426 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 04:00:08.062353 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.062332 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 04:00:08.062452 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.062376 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 04:00:08.062525 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.062507 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 04:00:08.156691 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.156665 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 04:00:08.160006 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.159985 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-node-exporter-textfile\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:08.160110 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.160015 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-node-exporter-wtmp\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:08.160110 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.160068 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:08.160183 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.160108 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-root\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:08.160183 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.160144 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qbnb\" (UniqueName: \"kubernetes.io/projected/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-kube-api-access-9qbnb\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:08.160183 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.160164 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-node-exporter-tls\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:08.160287 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.160230 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-metrics-client-ca\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:08.160287 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.160259 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-node-exporter-accelerators-collector-config\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" 
Apr 21 04:00:08.160287 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.160279 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-sys\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:08.261286 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.261251 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-node-exporter-textfile\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:08.261455 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.261294 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-node-exporter-wtmp\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:08.261455 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.261344 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:08.261455 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.261375 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-root\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:08.261455 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.261429 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qbnb\" (UniqueName: \"kubernetes.io/projected/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-kube-api-access-9qbnb\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:08.261667 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.261462 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-root\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:08.261667 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.261473 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-node-exporter-tls\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:08.261667 ip-10-0-131-93 kubenswrapper[2570]: E0421 04:00:08.261543 2570 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 21 04:00:08.261667 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.261562 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-metrics-client-ca\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:08.261667 ip-10-0-131-93 kubenswrapper[2570]: E0421 04:00:08.261612 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-node-exporter-tls podName:e88ce4e9-847b-4070-a0fe-fb0df7a4e988 nodeName:}" failed. No retries permitted until 2026-04-21 04:00:08.761593769 +0000 UTC m=+172.189365645 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-node-exporter-tls") pod "node-exporter-6xcz4" (UID: "e88ce4e9-847b-4070-a0fe-fb0df7a4e988") : secret "node-exporter-tls" not found Apr 21 04:00:08.261667 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.261636 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-node-exporter-accelerators-collector-config\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:08.261968 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.261675 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-sys\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:08.261968 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.261755 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-sys\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:08.262139 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.262120 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-node-exporter-textfile\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:08.262196 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.262127 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-metrics-client-ca\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:08.262196 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.261429 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-node-exporter-wtmp\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:08.262297 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.262243 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-node-exporter-accelerators-collector-config\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:08.264341 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.264317 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:08.269989 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.269964 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qbnb\" (UniqueName: \"kubernetes.io/projected/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-kube-api-access-9qbnb\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:08.766105 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:08.766076 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-node-exporter-tls\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:08.766303 ip-10-0-131-93 kubenswrapper[2570]: E0421 04:00:08.766280 2570 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 21 04:00:08.766382 ip-10-0-131-93 kubenswrapper[2570]: E0421 04:00:08.766371 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-node-exporter-tls podName:e88ce4e9-847b-4070-a0fe-fb0df7a4e988 nodeName:}" failed. No retries permitted until 2026-04-21 04:00:09.766348366 +0000 UTC m=+173.194120220 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-node-exporter-tls") pod "node-exporter-6xcz4" (UID: "e88ce4e9-847b-4070-a0fe-fb0df7a4e988") : secret "node-exporter-tls" not found Apr 21 04:00:09.773994 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:09.773949 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-node-exporter-tls\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:09.776205 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:09.776184 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e88ce4e9-847b-4070-a0fe-fb0df7a4e988-node-exporter-tls\") pod \"node-exporter-6xcz4\" (UID: \"e88ce4e9-847b-4070-a0fe-fb0df7a4e988\") " pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:09.867871 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:09.867836 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-6xcz4" Apr 21 04:00:09.875960 ip-10-0-131-93 kubenswrapper[2570]: W0421 04:00:09.875935 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode88ce4e9_847b_4070_a0fe_fb0df7a4e988.slice/crio-2dc6acc895a193725082947158bf748178a767858246d9d6d1f188adfdcc8341 WatchSource:0}: Error finding container 2dc6acc895a193725082947158bf748178a767858246d9d6d1f188adfdcc8341: Status 404 returned error can't find the container with id 2dc6acc895a193725082947158bf748178a767858246d9d6d1f188adfdcc8341 Apr 21 04:00:10.673596 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:10.673576 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-879kk" Apr 21 04:00:10.704108 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:10.704081 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6xcz4" event={"ID":"e88ce4e9-847b-4070-a0fe-fb0df7a4e988","Type":"ContainerStarted","Data":"419a53f46b4fa6cce73352e5a520675b6a02b14805cfb615501462f53613b931"} Apr 21 04:00:10.704108 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:10.704111 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6xcz4" event={"ID":"e88ce4e9-847b-4070-a0fe-fb0df7a4e988","Type":"ContainerStarted","Data":"2dc6acc895a193725082947158bf748178a767858246d9d6d1f188adfdcc8341"} Apr 21 04:00:11.708303 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:11.708273 2570 generic.go:358] "Generic (PLEG): container finished" podID="e88ce4e9-847b-4070-a0fe-fb0df7a4e988" containerID="419a53f46b4fa6cce73352e5a520675b6a02b14805cfb615501462f53613b931" exitCode=0 Apr 21 04:00:11.708677 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:11.708323 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6xcz4" event={"ID":"e88ce4e9-847b-4070-a0fe-fb0df7a4e988","Type":"ContainerDied","Data":"419a53f46b4fa6cce73352e5a520675b6a02b14805cfb615501462f53613b931"} Apr 21 04:00:12.712762 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:12.712727 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6xcz4" event={"ID":"e88ce4e9-847b-4070-a0fe-fb0df7a4e988","Type":"ContainerStarted","Data":"5543a07eca3b727d6dfa1a8913872204ca342dcb4152ba679a8bc7144f6ac1e6"} Apr 21 04:00:12.712762 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:12.712765 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6xcz4" event={"ID":"e88ce4e9-847b-4070-a0fe-fb0df7a4e988","Type":"ContainerStarted","Data":"b632144d7021e6b1eb82e67f6af4043827f5895c84419bedc7b78cd34b4aecc4"} Apr 21 04:00:12.736635 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:12.736591 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-6xcz4" podStartSLOduration=3.9804709750000002 podStartE2EDuration="4.736577855s" podCreationTimestamp="2026-04-21 04:00:08 +0000 UTC" firstStartedPulling="2026-04-21 04:00:09.877606934 +0000 UTC m=+173.305378790" lastFinishedPulling="2026-04-21 04:00:10.633713816 +0000 UTC m=+174.061485670" observedRunningTime="2026-04-21 04:00:12.735414525 +0000 UTC m=+176.163186400" watchObservedRunningTime="2026-04-21 04:00:12.736577855 +0000 UTC m=+176.164349758" Apr 21 04:00:18.156621 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:18.156577 2570 patch_prober.go:28] interesting 
pod/image-registry-f9667fbb5-8hs2h container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 21 04:00:18.157004 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:18.156639 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" podUID="3c847fe8-2248-42f1-9774-c95b562f9b61" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 04:00:19.664261 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:19.664231 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 04:00:21.968333 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:21.968291 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-f9667fbb5-8hs2h"] Apr 21 04:00:46.990889 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:46.990829 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" podUID="3c847fe8-2248-42f1-9774-c95b562f9b61" containerName="registry" containerID="cri-o://025c20a41f06b2bec5c044e2926e854d12f3e22f6aa65d9a07d1d6859ac593fb" gracePeriod=30 Apr 21 04:00:47.239456 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.239430 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 04:00:47.269815 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.269738 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-bound-sa-token\") pod \"3c847fe8-2248-42f1-9774-c95b562f9b61\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " Apr 21 04:00:47.269815 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.269773 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3c847fe8-2248-42f1-9774-c95b562f9b61-image-registry-private-configuration\") pod \"3c847fe8-2248-42f1-9774-c95b562f9b61\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " Apr 21 04:00:47.270054 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.269820 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj2g2\" (UniqueName: \"kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-kube-api-access-qj2g2\") pod \"3c847fe8-2248-42f1-9774-c95b562f9b61\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " Apr 21 04:00:47.270054 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.269984 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls\") pod \"3c847fe8-2248-42f1-9774-c95b562f9b61\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " Apr 21 04:00:47.270054 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.270022 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3c847fe8-2248-42f1-9774-c95b562f9b61-ca-trust-extracted\") pod \"3c847fe8-2248-42f1-9774-c95b562f9b61\" (UID: 
\"3c847fe8-2248-42f1-9774-c95b562f9b61\") " Apr 21 04:00:47.270228 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.270096 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c847fe8-2248-42f1-9774-c95b562f9b61-trusted-ca\") pod \"3c847fe8-2248-42f1-9774-c95b562f9b61\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " Apr 21 04:00:47.270228 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.270137 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-certificates\") pod \"3c847fe8-2248-42f1-9774-c95b562f9b61\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " Apr 21 04:00:47.270228 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.270166 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3c847fe8-2248-42f1-9774-c95b562f9b61-installation-pull-secrets\") pod \"3c847fe8-2248-42f1-9774-c95b562f9b61\" (UID: \"3c847fe8-2248-42f1-9774-c95b562f9b61\") " Apr 21 04:00:47.270905 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.270821 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "3c847fe8-2248-42f1-9774-c95b562f9b61" (UID: "3c847fe8-2248-42f1-9774-c95b562f9b61"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:00:47.271291 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.271253 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c847fe8-2248-42f1-9774-c95b562f9b61-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "3c847fe8-2248-42f1-9774-c95b562f9b61" (UID: "3c847fe8-2248-42f1-9774-c95b562f9b61"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:00:47.272729 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.272678 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c847fe8-2248-42f1-9774-c95b562f9b61-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "3c847fe8-2248-42f1-9774-c95b562f9b61" (UID: "3c847fe8-2248-42f1-9774-c95b562f9b61"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:00:47.272947 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.272923 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "3c847fe8-2248-42f1-9774-c95b562f9b61" (UID: "3c847fe8-2248-42f1-9774-c95b562f9b61"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:00:47.273121 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.273095 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-kube-api-access-qj2g2" (OuterVolumeSpecName: "kube-api-access-qj2g2") pod "3c847fe8-2248-42f1-9774-c95b562f9b61" (UID: "3c847fe8-2248-42f1-9774-c95b562f9b61"). InnerVolumeSpecName "kube-api-access-qj2g2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:00:47.273441 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.273388 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c847fe8-2248-42f1-9774-c95b562f9b61-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "3c847fe8-2248-42f1-9774-c95b562f9b61" (UID: "3c847fe8-2248-42f1-9774-c95b562f9b61"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:00:47.273525 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.273475 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "3c847fe8-2248-42f1-9774-c95b562f9b61" (UID: "3c847fe8-2248-42f1-9774-c95b562f9b61"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:00:47.279068 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.279026 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c847fe8-2248-42f1-9774-c95b562f9b61-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "3c847fe8-2248-42f1-9774-c95b562f9b61" (UID: "3c847fe8-2248-42f1-9774-c95b562f9b61"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:00:47.371114 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.371072 2570 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-certificates\") on node \"ip-10-0-131-93.ec2.internal\" DevicePath \"\"" Apr 21 04:00:47.371114 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.371112 2570 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3c847fe8-2248-42f1-9774-c95b562f9b61-installation-pull-secrets\") on node \"ip-10-0-131-93.ec2.internal\" DevicePath \"\"" Apr 21 04:00:47.371292 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.371130 2570 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-bound-sa-token\") on node \"ip-10-0-131-93.ec2.internal\" DevicePath \"\"" Apr 21 04:00:47.371292 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.371147 2570 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3c847fe8-2248-42f1-9774-c95b562f9b61-image-registry-private-configuration\") on node \"ip-10-0-131-93.ec2.internal\" DevicePath \"\"" Apr 21 04:00:47.371292 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.371162 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qj2g2\" (UniqueName: \"kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-kube-api-access-qj2g2\") on node \"ip-10-0-131-93.ec2.internal\" DevicePath \"\"" Apr 21 04:00:47.371292 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.371178 2570 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3c847fe8-2248-42f1-9774-c95b562f9b61-registry-tls\") on node \"ip-10-0-131-93.ec2.internal\" DevicePath \"\"" Apr 21 04:00:47.371292 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.371192 2570 reconciler_common.go:299] "Volume detached for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3c847fe8-2248-42f1-9774-c95b562f9b61-ca-trust-extracted\") on node \"ip-10-0-131-93.ec2.internal\" DevicePath \"\"" Apr 21 04:00:47.371292 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.371206 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c847fe8-2248-42f1-9774-c95b562f9b61-trusted-ca\") on node \"ip-10-0-131-93.ec2.internal\" DevicePath \"\"" Apr 21 04:00:47.802866 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.802835 2570 generic.go:358] "Generic (PLEG): container finished" podID="3c847fe8-2248-42f1-9774-c95b562f9b61" containerID="025c20a41f06b2bec5c044e2926e854d12f3e22f6aa65d9a07d1d6859ac593fb" exitCode=0 Apr 21 04:00:47.803054 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.802897 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" Apr 21 04:00:47.803054 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.802905 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" event={"ID":"3c847fe8-2248-42f1-9774-c95b562f9b61","Type":"ContainerDied","Data":"025c20a41f06b2bec5c044e2926e854d12f3e22f6aa65d9a07d1d6859ac593fb"} Apr 21 04:00:47.803054 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.802932 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-f9667fbb5-8hs2h" event={"ID":"3c847fe8-2248-42f1-9774-c95b562f9b61","Type":"ContainerDied","Data":"c109980ba61aea2346c0befd70a59bf911c7eb6fd021da08a27706173d8b4d06"} Apr 21 04:00:47.803054 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.802946 2570 scope.go:117] "RemoveContainer" containerID="025c20a41f06b2bec5c044e2926e854d12f3e22f6aa65d9a07d1d6859ac593fb" Apr 21 04:00:47.811138 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.811119 2570 scope.go:117] "RemoveContainer" containerID="025c20a41f06b2bec5c044e2926e854d12f3e22f6aa65d9a07d1d6859ac593fb" Apr 21 04:00:47.811364 ip-10-0-131-93 kubenswrapper[2570]: E0421 04:00:47.811348 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"025c20a41f06b2bec5c044e2926e854d12f3e22f6aa65d9a07d1d6859ac593fb\": container with ID starting with 025c20a41f06b2bec5c044e2926e854d12f3e22f6aa65d9a07d1d6859ac593fb not found: ID does not exist" containerID="025c20a41f06b2bec5c044e2926e854d12f3e22f6aa65d9a07d1d6859ac593fb" Apr 21 04:00:47.811410 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.811372 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"025c20a41f06b2bec5c044e2926e854d12f3e22f6aa65d9a07d1d6859ac593fb"} err="failed to get container status \"025c20a41f06b2bec5c044e2926e854d12f3e22f6aa65d9a07d1d6859ac593fb\": rpc error: code = NotFound desc = could not find container \"025c20a41f06b2bec5c044e2926e854d12f3e22f6aa65d9a07d1d6859ac593fb\": container with ID starting with 025c20a41f06b2bec5c044e2926e854d12f3e22f6aa65d9a07d1d6859ac593fb not found: ID does not exist" Apr 21 04:00:47.823925 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.823898 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-f9667fbb5-8hs2h"] Apr 21 04:00:47.830358 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:47.830337 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-image-registry/image-registry-f9667fbb5-8hs2h"] Apr 21 04:00:49.160125 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:00:49.160089 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c847fe8-2248-42f1-9774-c95b562f9b61" path="/var/lib/kubelet/pods/3c847fe8-2248-42f1-9774-c95b562f9b61/volumes" Apr 21 04:01:03.339252 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:01:03.339205 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" podUID="12a491ae-8363-4a48-8ded-779e5c0cb064" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 04:01:13.339473 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:01:13.339432 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" podUID="12a491ae-8363-4a48-8ded-779e5c0cb064" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 04:01:23.338996 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:01:23.338954 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" podUID="12a491ae-8363-4a48-8ded-779e5c0cb064" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 04:01:23.339378 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:01:23.339050 2570 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" Apr 21 04:01:23.339509 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:01:23.339491 2570 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"56add889d9b39bb1d38e758f714eddd3997341dbdc686625e80593e607d25bed"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 21 04:01:23.339549 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:01:23.339530 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" podUID="12a491ae-8363-4a48-8ded-779e5c0cb064" containerName="service-proxy" containerID="cri-o://56add889d9b39bb1d38e758f714eddd3997341dbdc686625e80593e607d25bed" gracePeriod=30 Apr 21 04:01:23.903604 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:01:23.903573 2570 generic.go:358] "Generic (PLEG): container finished" podID="12a491ae-8363-4a48-8ded-779e5c0cb064" containerID="56add889d9b39bb1d38e758f714eddd3997341dbdc686625e80593e607d25bed" exitCode=2 Apr 21 04:01:23.903782 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:01:23.903643 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" event={"ID":"12a491ae-8363-4a48-8ded-779e5c0cb064","Type":"ContainerDied","Data":"56add889d9b39bb1d38e758f714eddd3997341dbdc686625e80593e607d25bed"} Apr 21 04:01:23.903782 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:01:23.903682 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68599484b9-vwwzw" 
event={"ID":"12a491ae-8363-4a48-8ded-779e5c0cb064","Type":"ContainerStarted","Data":"2dc5db4e1fa8060fa6fbedeb6cdf8b39e2aabd21cf71624eea27771de2e206ae"} Apr 21 04:01:28.996910 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:01:28.996867 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs\") pod \"network-metrics-daemon-lf2dl\" (UID: \"0642b1aa-ff76-4694-bad0-be2656b81005\") " pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 04:01:28.999183 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:01:28.999157 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0642b1aa-ff76-4694-bad0-be2656b81005-metrics-certs\") pod \"network-metrics-daemon-lf2dl\" (UID: \"0642b1aa-ff76-4694-bad0-be2656b81005\") " pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 04:01:29.160290 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:01:29.160262 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-k29s5\"" Apr 21 04:01:29.168158 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:01:29.168138 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lf2dl" Apr 21 04:01:29.282102 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:01:29.282015 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lf2dl"] Apr 21 04:01:29.285763 ip-10-0-131-93 kubenswrapper[2570]: W0421 04:01:29.285735 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0642b1aa_ff76_4694_bad0_be2656b81005.slice/crio-9e680a7e8f9fc87b5dd7b5da5d0d2b22a956a97a465bcb31a676231205c28062 WatchSource:0}: Error finding container 9e680a7e8f9fc87b5dd7b5da5d0d2b22a956a97a465bcb31a676231205c28062: Status 404 returned error can't find the container with id 9e680a7e8f9fc87b5dd7b5da5d0d2b22a956a97a465bcb31a676231205c28062 Apr 21 04:01:29.919593 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:01:29.919555 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lf2dl" event={"ID":"0642b1aa-ff76-4694-bad0-be2656b81005","Type":"ContainerStarted","Data":"9e680a7e8f9fc87b5dd7b5da5d0d2b22a956a97a465bcb31a676231205c28062"} Apr 21 04:01:30.923830 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:01:30.923797 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lf2dl" event={"ID":"0642b1aa-ff76-4694-bad0-be2656b81005","Type":"ContainerStarted","Data":"067e5a0b5d4e6afa1c80249b499eec3371aeb0b0fa413aeae9b01a8041b7954e"} Apr 21 04:01:30.923830 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:01:30.923832 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lf2dl" event={"ID":"0642b1aa-ff76-4694-bad0-be2656b81005","Type":"ContainerStarted","Data":"dc75e7081e1052e4d54bf0418c8b13354174d69dc95ee95bda71e760d7cc08d2"} Apr 21 04:01:30.938126 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:01:30.938078 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-lf2dl" podStartSLOduration=252.981187981 podStartE2EDuration="4m13.938064154s" podCreationTimestamp="2026-04-21 03:57:17 +0000 UTC" firstStartedPulling="2026-04-21 04:01:29.287510634 +0000 UTC 
m=+252.715282492" lastFinishedPulling="2026-04-21 04:01:30.244386802 +0000 UTC m=+253.672158665" observedRunningTime="2026-04-21 04:01:30.937105469 +0000 UTC m=+254.364877335" watchObservedRunningTime="2026-04-21 04:01:30.938064154 +0000 UTC m=+254.365836029" Apr 21 04:02:17.032924 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:02:17.032895 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6p6fc_3a284a31-b4ea-4280-a6d1-b84390d1488d/ovn-acl-logging/0.log" Apr 21 04:02:17.033524 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:02:17.033508 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6p6fc_3a284a31-b4ea-4280-a6d1-b84390d1488d/ovn-acl-logging/0.log" Apr 21 04:02:17.037100 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:02:17.037076 2570 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 04:05:24.165942 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:24.165906 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4zvk2"] Apr 21 04:05:24.166397 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:24.166163 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c847fe8-2248-42f1-9774-c95b562f9b61" containerName="registry" Apr 21 04:05:24.166397 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:24.166187 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c847fe8-2248-42f1-9774-c95b562f9b61" containerName="registry" Apr 21 04:05:24.166397 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:24.166233 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="3c847fe8-2248-42f1-9774-c95b562f9b61" containerName="registry" Apr 21 04:05:24.167900 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:24.167885 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4zvk2" Apr 21 04:05:24.170358 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:24.170329 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-2gf4j\"" Apr 21 04:05:24.170498 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:24.170329 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 21 04:05:24.170498 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:24.170413 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 21 04:05:24.182820 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:24.182797 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4zvk2"] Apr 21 04:05:24.282154 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:24.282114 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5d7f8749-5ecc-4e25-9fa7-6127be5911bb-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-4zvk2\" (UID: \"5d7f8749-5ecc-4e25-9fa7-6127be5911bb\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4zvk2" Apr 21 04:05:24.282154 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:24.282155 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb2lz\" (UniqueName: \"kubernetes.io/projected/5d7f8749-5ecc-4e25-9fa7-6127be5911bb-kube-api-access-sb2lz\") pod \"cert-manager-operator-controller-manager-54b9655956-4zvk2\" (UID: \"5d7f8749-5ecc-4e25-9fa7-6127be5911bb\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4zvk2" Apr 21 04:05:24.383161 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:24.383125 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5d7f8749-5ecc-4e25-9fa7-6127be5911bb-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-4zvk2\" (UID: \"5d7f8749-5ecc-4e25-9fa7-6127be5911bb\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4zvk2" Apr 21 04:05:24.383161 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:24.383167 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sb2lz\" (UniqueName: \"kubernetes.io/projected/5d7f8749-5ecc-4e25-9fa7-6127be5911bb-kube-api-access-sb2lz\") pod \"cert-manager-operator-controller-manager-54b9655956-4zvk2\" (UID: \"5d7f8749-5ecc-4e25-9fa7-6127be5911bb\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4zvk2" Apr 21 04:05:24.383511 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:24.383490 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5d7f8749-5ecc-4e25-9fa7-6127be5911bb-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-4zvk2\" (UID: \"5d7f8749-5ecc-4e25-9fa7-6127be5911bb\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4zvk2" Apr 21 04:05:24.392836 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:24.392802 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-sb2lz\" (UniqueName: \"kubernetes.io/projected/5d7f8749-5ecc-4e25-9fa7-6127be5911bb-kube-api-access-sb2lz\") pod \"cert-manager-operator-controller-manager-54b9655956-4zvk2\" (UID: \"5d7f8749-5ecc-4e25-9fa7-6127be5911bb\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4zvk2" Apr 21 04:05:24.476803 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:24.476711 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4zvk2" Apr 21 04:05:24.597855 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:24.597823 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4zvk2"] Apr 21 04:05:24.601324 ip-10-0-131-93 kubenswrapper[2570]: W0421 04:05:24.601291 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d7f8749_5ecc_4e25_9fa7_6127be5911bb.slice/crio-e52b8330033ac21f24651ae2a9f687cafe88092fa5ca1dfc1690352c89f71319 WatchSource:0}: Error finding container e52b8330033ac21f24651ae2a9f687cafe88092fa5ca1dfc1690352c89f71319: Status 404 returned error can't find the container with id e52b8330033ac21f24651ae2a9f687cafe88092fa5ca1dfc1690352c89f71319 Apr 21 04:05:24.603739 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:24.603721 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 04:05:25.510427 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:25.510373 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4zvk2" event={"ID":"5d7f8749-5ecc-4e25-9fa7-6127be5911bb","Type":"ContainerStarted","Data":"e52b8330033ac21f24651ae2a9f687cafe88092fa5ca1dfc1690352c89f71319"} Apr 21 04:05:27.518458 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:27.518422 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4zvk2" event={"ID":"5d7f8749-5ecc-4e25-9fa7-6127be5911bb","Type":"ContainerStarted","Data":"d02c8630f1a89a53edc6e0a35f84af0abc6096628282c6f6f0c8b4d9e97e3104"} Apr 21 04:05:27.537659 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:27.537609 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-4zvk2" podStartSLOduration=1.173643136 podStartE2EDuration="3.537594198s" podCreationTimestamp="2026-04-21 04:05:24 +0000 UTC" firstStartedPulling="2026-04-21 04:05:24.603903767 +0000 UTC m=+488.031675627" lastFinishedPulling="2026-04-21 04:05:26.967854835 +0000 UTC m=+490.395626689" observedRunningTime="2026-04-21 04:05:27.535838384 +0000 UTC m=+490.963610261" watchObservedRunningTime="2026-04-21 04:05:27.537594198 +0000 UTC m=+490.965366072" Apr 21 04:05:35.375418 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:35.375382 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-5mnx7"] Apr 21 04:05:35.377520 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:35.377503 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-5mnx7" Apr 21 04:05:35.379999 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:35.379981 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 21 04:05:35.380155 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:35.380055 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 21 04:05:35.380317 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:35.380301 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-vz785\"" Apr 21 04:05:35.386096 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:35.386075 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-5mnx7"] Apr 21 04:05:35.462441 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:35.462411 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f57259b1-7959-4b48-bd97-33f7abf47ee5-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-5mnx7\" (UID: \"f57259b1-7959-4b48-bd97-33f7abf47ee5\") " pod="cert-manager/cert-manager-webhook-587ccfb98-5mnx7" Apr 21 04:05:35.462621 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:35.462452 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-497w4\" (UniqueName: \"kubernetes.io/projected/f57259b1-7959-4b48-bd97-33f7abf47ee5-kube-api-access-497w4\") pod \"cert-manager-webhook-587ccfb98-5mnx7\" (UID: \"f57259b1-7959-4b48-bd97-33f7abf47ee5\") " pod="cert-manager/cert-manager-webhook-587ccfb98-5mnx7" Apr 21 04:05:35.563198 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:35.563167 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-497w4\" (UniqueName: \"kubernetes.io/projected/f57259b1-7959-4b48-bd97-33f7abf47ee5-kube-api-access-497w4\") pod \"cert-manager-webhook-587ccfb98-5mnx7\" (UID: \"f57259b1-7959-4b48-bd97-33f7abf47ee5\") " pod="cert-manager/cert-manager-webhook-587ccfb98-5mnx7" Apr 21 04:05:35.563362 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:35.563223 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f57259b1-7959-4b48-bd97-33f7abf47ee5-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-5mnx7\" (UID: \"f57259b1-7959-4b48-bd97-33f7abf47ee5\") " pod="cert-manager/cert-manager-webhook-587ccfb98-5mnx7" Apr 21 04:05:35.570934 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:35.570900 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f57259b1-7959-4b48-bd97-33f7abf47ee5-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-5mnx7\" (UID: \"f57259b1-7959-4b48-bd97-33f7abf47ee5\") " pod="cert-manager/cert-manager-webhook-587ccfb98-5mnx7" Apr 21 04:05:35.571050 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:35.571016 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-497w4\" (UniqueName: \"kubernetes.io/projected/f57259b1-7959-4b48-bd97-33f7abf47ee5-kube-api-access-497w4\") pod \"cert-manager-webhook-587ccfb98-5mnx7\" (UID: \"f57259b1-7959-4b48-bd97-33f7abf47ee5\") " pod="cert-manager/cert-manager-webhook-587ccfb98-5mnx7" Apr 21 04:05:35.686764 
ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:35.686668 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-5mnx7" Apr 21 04:05:35.800057 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:35.800010 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-5mnx7"] Apr 21 04:05:35.802929 ip-10-0-131-93 kubenswrapper[2570]: W0421 04:05:35.802897 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf57259b1_7959_4b48_bd97_33f7abf47ee5.slice/crio-79c89c58324e3dec33c6880bb6b5c76b67e185f02423d8b3701beb14ad9be2d2 WatchSource:0}: Error finding container 79c89c58324e3dec33c6880bb6b5c76b67e185f02423d8b3701beb14ad9be2d2: Status 404 returned error can't find the container with id 79c89c58324e3dec33c6880bb6b5c76b67e185f02423d8b3701beb14ad9be2d2 Apr 21 04:05:36.544375 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:36.544338 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-5mnx7" event={"ID":"f57259b1-7959-4b48-bd97-33f7abf47ee5","Type":"ContainerStarted","Data":"79c89c58324e3dec33c6880bb6b5c76b67e185f02423d8b3701beb14ad9be2d2"} Apr 21 04:05:38.551675 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:38.551637 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-5mnx7" event={"ID":"f57259b1-7959-4b48-bd97-33f7abf47ee5","Type":"ContainerStarted","Data":"9c51a842476e6b2abb32f92af4769d3afa291ee91b25d91e7e06f7a643906705"} Apr 21 04:05:38.552154 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:38.551693 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-5mnx7" Apr 21 04:05:44.556695 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:44.556658 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-5mnx7" Apr 21 04:05:44.571988 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:44.571855 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-5mnx7" podStartSLOduration=7.5264524 podStartE2EDuration="9.571840828s" podCreationTimestamp="2026-04-21 04:05:35 +0000 UTC" firstStartedPulling="2026-04-21 04:05:35.804708321 +0000 UTC m=+499.232480175" lastFinishedPulling="2026-04-21 04:05:37.850096745 +0000 UTC m=+501.277868603" observedRunningTime="2026-04-21 04:05:38.56834263 +0000 UTC m=+501.996114506" watchObservedRunningTime="2026-04-21 04:05:44.571840828 +0000 UTC m=+507.999612699" Apr 21 04:05:46.645402 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:46.645367 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-5gt8z"] Apr 21 04:05:46.647466 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:46.647450 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5gt8z" Apr 21 04:05:46.649674 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:46.649653 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 21 04:05:46.650541 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:46.650526 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 21 04:05:46.650614 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:46.650528 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-mgcgk\"" Apr 21 04:05:46.659223 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:46.657313 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-5gt8z"] Apr 21 04:05:46.744979 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:46.744937 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwfgn\" (UniqueName: \"kubernetes.io/projected/47107d89-d540-48bc-8740-1672744e8434-kube-api-access-hwfgn\") pod \"openshift-lws-operator-bfc7f696d-5gt8z\" (UID: \"47107d89-d540-48bc-8740-1672744e8434\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5gt8z" Apr 21 04:05:46.744979 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:46.744977 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/47107d89-d540-48bc-8740-1672744e8434-tmp\") pod \"openshift-lws-operator-bfc7f696d-5gt8z\" (UID: \"47107d89-d540-48bc-8740-1672744e8434\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5gt8z" Apr 21 04:05:46.846030 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:46.845991 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwfgn\" (UniqueName: \"kubernetes.io/projected/47107d89-d540-48bc-8740-1672744e8434-kube-api-access-hwfgn\") pod \"openshift-lws-operator-bfc7f696d-5gt8z\" (UID: \"47107d89-d540-48bc-8740-1672744e8434\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5gt8z" Apr 21 04:05:46.846145 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:46.846061 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/47107d89-d540-48bc-8740-1672744e8434-tmp\") pod \"openshift-lws-operator-bfc7f696d-5gt8z\" (UID: \"47107d89-d540-48bc-8740-1672744e8434\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5gt8z" Apr 21 04:05:46.846421 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:46.846404 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/47107d89-d540-48bc-8740-1672744e8434-tmp\") pod \"openshift-lws-operator-bfc7f696d-5gt8z\" (UID: \"47107d89-d540-48bc-8740-1672744e8434\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5gt8z" Apr 21 04:05:46.853531 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:46.853510 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwfgn\" (UniqueName: \"kubernetes.io/projected/47107d89-d540-48bc-8740-1672744e8434-kube-api-access-hwfgn\") pod \"openshift-lws-operator-bfc7f696d-5gt8z\" (UID: \"47107d89-d540-48bc-8740-1672744e8434\") " 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5gt8z" Apr 21 04:05:46.961354 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:46.961269 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5gt8z" Apr 21 04:05:47.079895 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:47.079743 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-5gt8z"] Apr 21 04:05:47.082481 ip-10-0-131-93 kubenswrapper[2570]: W0421 04:05:47.082454 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47107d89_d540_48bc_8740_1672744e8434.slice/crio-0f9e25b4b91d41b55188872fb55c98d0f86bebf71f991fee7fb82fb49975ed14 WatchSource:0}: Error finding container 0f9e25b4b91d41b55188872fb55c98d0f86bebf71f991fee7fb82fb49975ed14: Status 404 returned error can't find the container with id 0f9e25b4b91d41b55188872fb55c98d0f86bebf71f991fee7fb82fb49975ed14 Apr 21 04:05:47.576685 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:47.576654 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5gt8z" event={"ID":"47107d89-d540-48bc-8740-1672744e8434","Type":"ContainerStarted","Data":"0f9e25b4b91d41b55188872fb55c98d0f86bebf71f991fee7fb82fb49975ed14"} Apr 21 04:05:49.584724 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:49.584690 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5gt8z" event={"ID":"47107d89-d540-48bc-8740-1672744e8434","Type":"ContainerStarted","Data":"2409a58aa0a3bb581f42ae3a5d1b83fab90cd985f9334460b60b3b170c649aab"} Apr 21 04:05:49.599543 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:05:49.599499 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5gt8z" podStartSLOduration=1.301763826 podStartE2EDuration="3.599485422s" podCreationTimestamp="2026-04-21 04:05:46 +0000 UTC" firstStartedPulling="2026-04-21 04:05:47.083970358 +0000 UTC m=+510.511742211" lastFinishedPulling="2026-04-21 04:05:49.381691955 +0000 UTC m=+512.809463807" observedRunningTime="2026-04-21 04:05:49.598471143 +0000 UTC m=+513.026243040" watchObservedRunningTime="2026-04-21 04:05:49.599485422 +0000 UTC m=+513.027257297" Apr 21 04:06:20.011446 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:06:20.011367 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-57f75ff788-kgttj"] Apr 21 04:06:20.013670 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:06:20.013652 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-57f75ff788-kgttj" Apr 21 04:06:20.015807 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:06:20.015786 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 21 04:06:20.016834 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:06:20.016815 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 21 04:06:20.016942 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:06:20.016858 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-xl2bs\"" Apr 21 04:06:20.016942 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:06:20.016866 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 21 04:06:20.023372 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:06:20.023348 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-57f75ff788-kgttj"] Apr 21 04:06:20.093817 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:06:20.093777 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/4657d863-7957-4d85-b78a-ac3eb614e0bc-metrics-cert\") pod \"lws-controller-manager-57f75ff788-kgttj\" (UID: \"4657d863-7957-4d85-b78a-ac3eb614e0bc\") " pod="openshift-lws-operator/lws-controller-manager-57f75ff788-kgttj" Apr 21 04:06:20.093817 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:06:20.093824 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4657d863-7957-4d85-b78a-ac3eb614e0bc-cert\") pod \"lws-controller-manager-57f75ff788-kgttj\" (UID: \"4657d863-7957-4d85-b78a-ac3eb614e0bc\") " pod="openshift-lws-operator/lws-controller-manager-57f75ff788-kgttj" Apr 21 04:06:20.094031 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:06:20.093844 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/4657d863-7957-4d85-b78a-ac3eb614e0bc-manager-config\") pod \"lws-controller-manager-57f75ff788-kgttj\" (UID: \"4657d863-7957-4d85-b78a-ac3eb614e0bc\") " pod="openshift-lws-operator/lws-controller-manager-57f75ff788-kgttj" Apr 21 04:06:20.094031 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:06:20.093868 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfbcb\" (UniqueName: \"kubernetes.io/projected/4657d863-7957-4d85-b78a-ac3eb614e0bc-kube-api-access-rfbcb\") pod \"lws-controller-manager-57f75ff788-kgttj\" (UID: \"4657d863-7957-4d85-b78a-ac3eb614e0bc\") " pod="openshift-lws-operator/lws-controller-manager-57f75ff788-kgttj" Apr 21 04:06:20.194619 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:06:20.194584 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4657d863-7957-4d85-b78a-ac3eb614e0bc-cert\") pod \"lws-controller-manager-57f75ff788-kgttj\" (UID: \"4657d863-7957-4d85-b78a-ac3eb614e0bc\") " pod="openshift-lws-operator/lws-controller-manager-57f75ff788-kgttj" Apr 21 04:06:20.194619 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:06:20.194625 2570 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/4657d863-7957-4d85-b78a-ac3eb614e0bc-manager-config\") pod \"lws-controller-manager-57f75ff788-kgttj\" (UID: \"4657d863-7957-4d85-b78a-ac3eb614e0bc\") " pod="openshift-lws-operator/lws-controller-manager-57f75ff788-kgttj" Apr 21 04:06:20.194858 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:06:20.194652 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rfbcb\" (UniqueName: \"kubernetes.io/projected/4657d863-7957-4d85-b78a-ac3eb614e0bc-kube-api-access-rfbcb\") pod \"lws-controller-manager-57f75ff788-kgttj\" (UID: \"4657d863-7957-4d85-b78a-ac3eb614e0bc\") " pod="openshift-lws-operator/lws-controller-manager-57f75ff788-kgttj" Apr 21 04:06:20.194858 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:06:20.194739 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/4657d863-7957-4d85-b78a-ac3eb614e0bc-metrics-cert\") pod \"lws-controller-manager-57f75ff788-kgttj\" (UID: \"4657d863-7957-4d85-b78a-ac3eb614e0bc\") " pod="openshift-lws-operator/lws-controller-manager-57f75ff788-kgttj" Apr 21 04:06:20.195250 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:06:20.195229 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/4657d863-7957-4d85-b78a-ac3eb614e0bc-manager-config\") pod \"lws-controller-manager-57f75ff788-kgttj\" (UID: \"4657d863-7957-4d85-b78a-ac3eb614e0bc\") " pod="openshift-lws-operator/lws-controller-manager-57f75ff788-kgttj" Apr 21 04:06:20.197139 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:06:20.197120 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/4657d863-7957-4d85-b78a-ac3eb614e0bc-metrics-cert\") pod \"lws-controller-manager-57f75ff788-kgttj\" (UID: \"4657d863-7957-4d85-b78a-ac3eb614e0bc\") " pod="openshift-lws-operator/lws-controller-manager-57f75ff788-kgttj" Apr 21 04:06:20.197219 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:06:20.197177 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4657d863-7957-4d85-b78a-ac3eb614e0bc-cert\") pod \"lws-controller-manager-57f75ff788-kgttj\" (UID: \"4657d863-7957-4d85-b78a-ac3eb614e0bc\") " pod="openshift-lws-operator/lws-controller-manager-57f75ff788-kgttj" Apr 21 04:06:20.203425 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:06:20.203407 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfbcb\" (UniqueName: \"kubernetes.io/projected/4657d863-7957-4d85-b78a-ac3eb614e0bc-kube-api-access-rfbcb\") pod \"lws-controller-manager-57f75ff788-kgttj\" (UID: \"4657d863-7957-4d85-b78a-ac3eb614e0bc\") " pod="openshift-lws-operator/lws-controller-manager-57f75ff788-kgttj" Apr 21 04:06:20.322738 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:06:20.322646 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-57f75ff788-kgttj" Apr 21 04:06:20.437760 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:06:20.437731 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-57f75ff788-kgttj"] Apr 21 04:06:20.441750 ip-10-0-131-93 kubenswrapper[2570]: W0421 04:06:20.441722 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4657d863_7957_4d85_b78a_ac3eb614e0bc.slice/crio-c94af4e05d8cc681489a808a7cc9135f3e7385294c4eec1f4eeb8be26ae1c5ee WatchSource:0}: Error finding container c94af4e05d8cc681489a808a7cc9135f3e7385294c4eec1f4eeb8be26ae1c5ee: Status 404 returned error can't find the container with id c94af4e05d8cc681489a808a7cc9135f3e7385294c4eec1f4eeb8be26ae1c5ee Apr 21 04:06:20.683210 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:06:20.683175 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-57f75ff788-kgttj" event={"ID":"4657d863-7957-4d85-b78a-ac3eb614e0bc","Type":"ContainerStarted","Data":"c94af4e05d8cc681489a808a7cc9135f3e7385294c4eec1f4eeb8be26ae1c5ee"} Apr 21 04:06:22.691962 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:06:22.691929 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-57f75ff788-kgttj" event={"ID":"4657d863-7957-4d85-b78a-ac3eb614e0bc","Type":"ContainerStarted","Data":"052097d998c0f92af627b5be18c938acb1f2d73ce72d2dd4bc17b2afcfa763a5"} Apr 21 04:06:22.692369 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:06:22.691985 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-57f75ff788-kgttj" Apr 21 04:06:22.707722 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:06:22.707674 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-57f75ff788-kgttj" podStartSLOduration=1.93256465 podStartE2EDuration="3.707658821s" podCreationTimestamp="2026-04-21 04:06:19 +0000 UTC" firstStartedPulling="2026-04-21 04:06:20.44761639 +0000 UTC m=+543.875388243" lastFinishedPulling="2026-04-21 04:06:22.222710554 +0000 UTC m=+545.650482414" observedRunningTime="2026-04-21 04:06:22.70709186 +0000 UTC m=+546.134863729" watchObservedRunningTime="2026-04-21 04:06:22.707658821 +0000 UTC m=+546.135430697" Apr 21 04:06:33.697678 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:06:33.697648 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-57f75ff788-kgttj" Apr 21 04:07:07.048725 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:07:07.048689 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-tv474"] Apr 21 04:07:07.050751 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:07:07.050735 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-tv474" Apr 21 04:07:07.052854 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:07:07.052825 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 21 04:07:07.052979 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:07:07.052857 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 21 04:07:07.052979 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:07:07.052903 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-xkw24\"" Apr 21 04:07:07.059123 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:07:07.059100 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-tv474"] Apr 21 04:07:07.128883 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:07:07.128848 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpzf6\" (UniqueName: \"kubernetes.io/projected/d59fc445-9cf1-4c56-8258-a5bbc616d1ca-kube-api-access-gpzf6\") pod \"authorino-operator-7587b89b76-tv474\" (UID: \"d59fc445-9cf1-4c56-8258-a5bbc616d1ca\") " pod="kuadrant-system/authorino-operator-7587b89b76-tv474" Apr 21 04:07:07.230101 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:07:07.230068 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gpzf6\" (UniqueName: \"kubernetes.io/projected/d59fc445-9cf1-4c56-8258-a5bbc616d1ca-kube-api-access-gpzf6\") pod \"authorino-operator-7587b89b76-tv474\" (UID: \"d59fc445-9cf1-4c56-8258-a5bbc616d1ca\") " pod="kuadrant-system/authorino-operator-7587b89b76-tv474" Apr 21 04:07:07.246406 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:07:07.246374 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpzf6\" (UniqueName: \"kubernetes.io/projected/d59fc445-9cf1-4c56-8258-a5bbc616d1ca-kube-api-access-gpzf6\") pod \"authorino-operator-7587b89b76-tv474\" (UID: \"d59fc445-9cf1-4c56-8258-a5bbc616d1ca\") " pod="kuadrant-system/authorino-operator-7587b89b76-tv474" Apr 21 04:07:07.361532 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:07:07.361494 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-tv474" Apr 21 04:07:07.486112 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:07:07.486079 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-tv474"] Apr 21 04:07:07.489751 ip-10-0-131-93 kubenswrapper[2570]: W0421 04:07:07.489722 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd59fc445_9cf1_4c56_8258_a5bbc616d1ca.slice/crio-6da4ee1bd39a01160b4c29af05f03e663c8d18fd811e5ba64cc092c5e84684a2 WatchSource:0}: Error finding container 6da4ee1bd39a01160b4c29af05f03e663c8d18fd811e5ba64cc092c5e84684a2: Status 404 returned error can't find the container with id 6da4ee1bd39a01160b4c29af05f03e663c8d18fd811e5ba64cc092c5e84684a2 Apr 21 04:07:07.828557 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:07:07.828469 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-tv474" event={"ID":"d59fc445-9cf1-4c56-8258-a5bbc616d1ca","Type":"ContainerStarted","Data":"6da4ee1bd39a01160b4c29af05f03e663c8d18fd811e5ba64cc092c5e84684a2"} Apr 21 04:07:10.839967 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:07:10.839932 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-tv474" event={"ID":"d59fc445-9cf1-4c56-8258-a5bbc616d1ca","Type":"ContainerStarted","Data":"c8572359761f90ff42e113f1766ff13e1447475c1b9eb25b286a56a788e80275"} Apr 21 04:07:10.840391 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:07:10.840160 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-tv474" Apr 21 04:07:10.856982 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:07:10.856931 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-tv474" podStartSLOduration=0.978039235 podStartE2EDuration="3.856917968s" podCreationTimestamp="2026-04-21 04:07:07 +0000 UTC" firstStartedPulling="2026-04-21 04:07:07.492233962 +0000 UTC m=+590.920005815" lastFinishedPulling="2026-04-21 04:07:10.371112694 +0000 UTC m=+593.798884548" observedRunningTime="2026-04-21 04:07:10.855356334 +0000 UTC m=+594.283128212" watchObservedRunningTime="2026-04-21 04:07:10.856917968 +0000 UTC m=+594.284689842" Apr 21 04:07:17.052902 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:07:17.052863 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6p6fc_3a284a31-b4ea-4280-a6d1-b84390d1488d/ovn-acl-logging/0.log" Apr 21 04:07:17.053495 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:07:17.053471 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6p6fc_3a284a31-b4ea-4280-a6d1-b84390d1488d/ovn-acl-logging/0.log" Apr 21 04:07:21.845272 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:07:21.845242 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-tv474" Apr 21 04:12:17.075580 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:12:17.075502 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6p6fc_3a284a31-b4ea-4280-a6d1-b84390d1488d/ovn-acl-logging/0.log" Apr 21 04:12:17.077119 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:12:17.077095 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6p6fc_3a284a31-b4ea-4280-a6d1-b84390d1488d/ovn-acl-logging/0.log" Apr 21 04:17:17.094317 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:17:17.094286 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6p6fc_3a284a31-b4ea-4280-a6d1-b84390d1488d/ovn-acl-logging/0.log" Apr 21 04:17:17.095929 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:17:17.095776 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6p6fc_3a284a31-b4ea-4280-a6d1-b84390d1488d/ovn-acl-logging/0.log" Apr 21 04:18:05.884583 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:05.884497 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bskcl/must-gather-94hzr"] Apr 21 04:18:05.886885 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:05.886863 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bskcl/must-gather-94hzr" Apr 21 04:18:05.889420 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:05.889394 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-bskcl\"/\"openshift-service-ca.crt\"" Apr 21 04:18:05.889420 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:05.889419 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-bskcl\"/\"kube-root-ca.crt\"" Apr 21 04:18:05.890330 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:05.890316 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-bskcl\"/\"default-dockercfg-k6dll\"" Apr 21 04:18:05.903886 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:05.903862 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bskcl/must-gather-94hzr"] Apr 21 04:18:05.966235 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:05.966202 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ea88884d-09d2-4193-8055-969c4e9441d1-must-gather-output\") pod \"must-gather-94hzr\" (UID: \"ea88884d-09d2-4193-8055-969c4e9441d1\") " pod="openshift-must-gather-bskcl/must-gather-94hzr" Apr 21 04:18:05.966235 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:05.966241 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cnxm\" (UniqueName: \"kubernetes.io/projected/ea88884d-09d2-4193-8055-969c4e9441d1-kube-api-access-5cnxm\") pod \"must-gather-94hzr\" (UID: \"ea88884d-09d2-4193-8055-969c4e9441d1\") " pod="openshift-must-gather-bskcl/must-gather-94hzr" Apr 21 04:18:06.067375 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:06.067342 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5cnxm\" (UniqueName: \"kubernetes.io/projected/ea88884d-09d2-4193-8055-969c4e9441d1-kube-api-access-5cnxm\") pod \"must-gather-94hzr\" (UID: \"ea88884d-09d2-4193-8055-969c4e9441d1\") " pod="openshift-must-gather-bskcl/must-gather-94hzr" Apr 21 04:18:06.067532 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:06.067420 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ea88884d-09d2-4193-8055-969c4e9441d1-must-gather-output\") pod \"must-gather-94hzr\" (UID: \"ea88884d-09d2-4193-8055-969c4e9441d1\") " 
pod="openshift-must-gather-bskcl/must-gather-94hzr" Apr 21 04:18:06.067763 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:06.067745 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ea88884d-09d2-4193-8055-969c4e9441d1-must-gather-output\") pod \"must-gather-94hzr\" (UID: \"ea88884d-09d2-4193-8055-969c4e9441d1\") " pod="openshift-must-gather-bskcl/must-gather-94hzr" Apr 21 04:18:06.077637 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:06.077587 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cnxm\" (UniqueName: \"kubernetes.io/projected/ea88884d-09d2-4193-8055-969c4e9441d1-kube-api-access-5cnxm\") pod \"must-gather-94hzr\" (UID: \"ea88884d-09d2-4193-8055-969c4e9441d1\") " pod="openshift-must-gather-bskcl/must-gather-94hzr" Apr 21 04:18:06.195424 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:06.195340 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bskcl/must-gather-94hzr" Apr 21 04:18:06.322779 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:06.322661 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bskcl/must-gather-94hzr"] Apr 21 04:18:06.325951 ip-10-0-131-93 kubenswrapper[2570]: W0421 04:18:06.325921 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea88884d_09d2_4193_8055_969c4e9441d1.slice/crio-bdbdfc8770e19728f368207888d91df9d54dea24351f582c487b234b476d58c4 WatchSource:0}: Error finding container bdbdfc8770e19728f368207888d91df9d54dea24351f582c487b234b476d58c4: Status 404 returned error can't find the container with id bdbdfc8770e19728f368207888d91df9d54dea24351f582c487b234b476d58c4 Apr 21 04:18:06.327666 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:06.327648 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 04:18:06.815212 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:06.815172 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bskcl/must-gather-94hzr" event={"ID":"ea88884d-09d2-4193-8055-969c4e9441d1","Type":"ContainerStarted","Data":"bdbdfc8770e19728f368207888d91df9d54dea24351f582c487b234b476d58c4"} Apr 21 04:18:12.839597 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:12.839557 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bskcl/must-gather-94hzr" event={"ID":"ea88884d-09d2-4193-8055-969c4e9441d1","Type":"ContainerStarted","Data":"9e339e0a0addf5b15f75d5d2aed1659adb823d5f5c0281de33c9c82020ac7547"} Apr 21 04:18:12.839597 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:12.839599 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bskcl/must-gather-94hzr" event={"ID":"ea88884d-09d2-4193-8055-969c4e9441d1","Type":"ContainerStarted","Data":"191689afd05b3a35fa63b48bb73d7b5fc3617ba5cd99ce9d70b9c2e595526630"} Apr 21 04:18:12.854957 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:12.854897 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bskcl/must-gather-94hzr" podStartSLOduration=1.924778715 podStartE2EDuration="7.854879687s" podCreationTimestamp="2026-04-21 04:18:05 +0000 UTC" firstStartedPulling="2026-04-21 04:18:06.327796315 +0000 UTC m=+1249.755568168" lastFinishedPulling="2026-04-21 04:18:12.257897284 +0000 UTC m=+1255.685669140" observedRunningTime="2026-04-21 
04:18:12.853541827 +0000 UTC m=+1256.281313703" watchObservedRunningTime="2026-04-21 04:18:12.854879687 +0000 UTC m=+1256.282651563" Apr 21 04:18:25.881557 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:25.881518 2570 generic.go:358] "Generic (PLEG): container finished" podID="ea88884d-09d2-4193-8055-969c4e9441d1" containerID="191689afd05b3a35fa63b48bb73d7b5fc3617ba5cd99ce9d70b9c2e595526630" exitCode=0 Apr 21 04:18:25.882114 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:25.881598 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bskcl/must-gather-94hzr" event={"ID":"ea88884d-09d2-4193-8055-969c4e9441d1","Type":"ContainerDied","Data":"191689afd05b3a35fa63b48bb73d7b5fc3617ba5cd99ce9d70b9c2e595526630"} Apr 21 04:18:25.882114 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:25.882021 2570 scope.go:117] "RemoveContainer" containerID="191689afd05b3a35fa63b48bb73d7b5fc3617ba5cd99ce9d70b9c2e595526630" Apr 21 04:18:26.592318 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:26.592286 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bskcl_must-gather-94hzr_ea88884d-09d2-4193-8055-969c4e9441d1/gather/0.log" Apr 21 04:18:29.812763 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:29.812737 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-m69zf_76716b30-cec0-4d8e-8c08-452dfeb18893/global-pull-secret-syncer/0.log" Apr 21 04:18:29.980580 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:29.980552 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-kmbjr_c98b9edd-432f-4ffa-a024-8b9f651147e0/konnectivity-agent/0.log" Apr 21 04:18:29.998160 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:29.998132 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-93.ec2.internal_364f28bb57f65923dbeacedd6b253c36/haproxy/0.log" Apr 21 04:18:31.922560 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:31.922522 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bskcl/must-gather-94hzr"] Apr 21 04:18:31.922989 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:31.922768 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-bskcl/must-gather-94hzr" podUID="ea88884d-09d2-4193-8055-969c4e9441d1" containerName="copy" containerID="cri-o://9e339e0a0addf5b15f75d5d2aed1659adb823d5f5c0281de33c9c82020ac7547" gracePeriod=2 Apr 21 04:18:31.925057 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:31.925012 2570 status_manager.go:895] "Failed to get status for pod" podUID="ea88884d-09d2-4193-8055-969c4e9441d1" pod="openshift-must-gather-bskcl/must-gather-94hzr" err="pods \"must-gather-94hzr\" is forbidden: User \"system:node:ip-10-0-131-93.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-bskcl\": no relationship found between node 'ip-10-0-131-93.ec2.internal' and this object" Apr 21 04:18:31.925915 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:31.925890 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bskcl/must-gather-94hzr"] Apr 21 04:18:32.150463 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:32.150438 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bskcl_must-gather-94hzr_ea88884d-09d2-4193-8055-969c4e9441d1/copy/0.log" Apr 21 04:18:32.150795 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:32.150780 2570 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bskcl/must-gather-94hzr" Apr 21 04:18:32.152617 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:32.152593 2570 status_manager.go:895] "Failed to get status for pod" podUID="ea88884d-09d2-4193-8055-969c4e9441d1" pod="openshift-must-gather-bskcl/must-gather-94hzr" err="pods \"must-gather-94hzr\" is forbidden: User \"system:node:ip-10-0-131-93.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-bskcl\": no relationship found between node 'ip-10-0-131-93.ec2.internal' and this object" Apr 21 04:18:32.276183 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:32.276094 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ea88884d-09d2-4193-8055-969c4e9441d1-must-gather-output\") pod \"ea88884d-09d2-4193-8055-969c4e9441d1\" (UID: \"ea88884d-09d2-4193-8055-969c4e9441d1\") " Apr 21 04:18:32.276335 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:32.276198 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cnxm\" (UniqueName: \"kubernetes.io/projected/ea88884d-09d2-4193-8055-969c4e9441d1-kube-api-access-5cnxm\") pod \"ea88884d-09d2-4193-8055-969c4e9441d1\" (UID: \"ea88884d-09d2-4193-8055-969c4e9441d1\") " Apr 21 04:18:32.277965 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:32.277933 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea88884d-09d2-4193-8055-969c4e9441d1-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ea88884d-09d2-4193-8055-969c4e9441d1" (UID: "ea88884d-09d2-4193-8055-969c4e9441d1"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:18:32.278444 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:32.278411 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea88884d-09d2-4193-8055-969c4e9441d1-kube-api-access-5cnxm" (OuterVolumeSpecName: "kube-api-access-5cnxm") pod "ea88884d-09d2-4193-8055-969c4e9441d1" (UID: "ea88884d-09d2-4193-8055-969c4e9441d1"). InnerVolumeSpecName "kube-api-access-5cnxm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:18:32.377322 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:32.377274 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5cnxm\" (UniqueName: \"kubernetes.io/projected/ea88884d-09d2-4193-8055-969c4e9441d1-kube-api-access-5cnxm\") on node \"ip-10-0-131-93.ec2.internal\" DevicePath \"\"" Apr 21 04:18:32.377322 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:32.377314 2570 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ea88884d-09d2-4193-8055-969c4e9441d1-must-gather-output\") on node \"ip-10-0-131-93.ec2.internal\" DevicePath \"\"" Apr 21 04:18:32.904433 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:32.904402 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bskcl_must-gather-94hzr_ea88884d-09d2-4193-8055-969c4e9441d1/copy/0.log" Apr 21 04:18:32.904738 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:32.904715 2570 generic.go:358] "Generic (PLEG): container finished" podID="ea88884d-09d2-4193-8055-969c4e9441d1" containerID="9e339e0a0addf5b15f75d5d2aed1659adb823d5f5c0281de33c9c82020ac7547" exitCode=143 Apr 21 04:18:32.904808 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:32.904765 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bskcl/must-gather-94hzr" Apr 21 04:18:32.904808 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:32.904803 2570 scope.go:117] "RemoveContainer" containerID="9e339e0a0addf5b15f75d5d2aed1659adb823d5f5c0281de33c9c82020ac7547" Apr 21 04:18:32.907048 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:32.907007 2570 status_manager.go:895] "Failed to get status for pod" podUID="ea88884d-09d2-4193-8055-969c4e9441d1" pod="openshift-must-gather-bskcl/must-gather-94hzr" err="pods \"must-gather-94hzr\" is forbidden: User \"system:node:ip-10-0-131-93.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-bskcl\": no relationship found between node 'ip-10-0-131-93.ec2.internal' and this object" Apr 21 04:18:32.912814 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:32.912794 2570 scope.go:117] "RemoveContainer" containerID="191689afd05b3a35fa63b48bb73d7b5fc3617ba5cd99ce9d70b9c2e595526630" Apr 21 04:18:32.915534 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:32.915510 2570 status_manager.go:895] "Failed to get status for pod" podUID="ea88884d-09d2-4193-8055-969c4e9441d1" pod="openshift-must-gather-bskcl/must-gather-94hzr" err="pods \"must-gather-94hzr\" is forbidden: User \"system:node:ip-10-0-131-93.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-bskcl\": no relationship found between node 'ip-10-0-131-93.ec2.internal' and this object" Apr 21 04:18:32.924113 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:32.923941 2570 scope.go:117] "RemoveContainer" containerID="9e339e0a0addf5b15f75d5d2aed1659adb823d5f5c0281de33c9c82020ac7547" Apr 21 04:18:32.924323 ip-10-0-131-93 kubenswrapper[2570]: E0421 04:18:32.924211 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e339e0a0addf5b15f75d5d2aed1659adb823d5f5c0281de33c9c82020ac7547\": container with ID starting with 9e339e0a0addf5b15f75d5d2aed1659adb823d5f5c0281de33c9c82020ac7547 not found: ID does not exist" containerID="9e339e0a0addf5b15f75d5d2aed1659adb823d5f5c0281de33c9c82020ac7547" Apr 21 04:18:32.924323 
ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:32.924234 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e339e0a0addf5b15f75d5d2aed1659adb823d5f5c0281de33c9c82020ac7547"} err="failed to get container status \"9e339e0a0addf5b15f75d5d2aed1659adb823d5f5c0281de33c9c82020ac7547\": rpc error: code = NotFound desc = could not find container \"9e339e0a0addf5b15f75d5d2aed1659adb823d5f5c0281de33c9c82020ac7547\": container with ID starting with 9e339e0a0addf5b15f75d5d2aed1659adb823d5f5c0281de33c9c82020ac7547 not found: ID does not exist" Apr 21 04:18:32.924323 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:32.924251 2570 scope.go:117] "RemoveContainer" containerID="191689afd05b3a35fa63b48bb73d7b5fc3617ba5cd99ce9d70b9c2e595526630" Apr 21 04:18:32.924475 ip-10-0-131-93 kubenswrapper[2570]: E0421 04:18:32.924457 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"191689afd05b3a35fa63b48bb73d7b5fc3617ba5cd99ce9d70b9c2e595526630\": container with ID starting with 191689afd05b3a35fa63b48bb73d7b5fc3617ba5cd99ce9d70b9c2e595526630 not found: ID does not exist" containerID="191689afd05b3a35fa63b48bb73d7b5fc3617ba5cd99ce9d70b9c2e595526630" Apr 21 04:18:32.924513 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:32.924481 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"191689afd05b3a35fa63b48bb73d7b5fc3617ba5cd99ce9d70b9c2e595526630"} err="failed to get container status \"191689afd05b3a35fa63b48bb73d7b5fc3617ba5cd99ce9d70b9c2e595526630\": rpc error: code = NotFound desc = could not find container \"191689afd05b3a35fa63b48bb73d7b5fc3617ba5cd99ce9d70b9c2e595526630\": container with ID starting with 191689afd05b3a35fa63b48bb73d7b5fc3617ba5cd99ce9d70b9c2e595526630 not found: ID does not exist" Apr 21 04:18:33.160414 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:33.160332 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea88884d-09d2-4193-8055-969c4e9441d1" path="/var/lib/kubelet/pods/ea88884d-09d2-4193-8055-969c4e9441d1/volumes" Apr 21 04:18:33.969958 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:33.969930 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-tv474_d59fc445-9cf1-4c56-8258-a5bbc616d1ca/manager/0.log" Apr 21 04:18:35.425749 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:35.425723 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6xcz4_e88ce4e9-847b-4070-a0fe-fb0df7a4e988/node-exporter/0.log" Apr 21 04:18:35.448276 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:35.448243 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6xcz4_e88ce4e9-847b-4070-a0fe-fb0df7a4e988/kube-rbac-proxy/0.log" Apr 21 04:18:35.468266 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:35.468241 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6xcz4_e88ce4e9-847b-4070-a0fe-fb0df7a4e988/init-textfile/0.log" Apr 21 04:18:37.271884 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:37.271850 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-cctsl_97f709de-df2d-4000-add8-f6588eda3b15/networking-console-plugin/0.log" Apr 21 04:18:38.717223 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:38.717186 2570 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-dhdf9/perf-node-gather-daemonset-m5ztp"] Apr 21 04:18:38.717603 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:38.717480 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea88884d-09d2-4193-8055-969c4e9441d1" containerName="gather" Apr 21 04:18:38.717603 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:38.717492 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea88884d-09d2-4193-8055-969c4e9441d1" containerName="gather" Apr 21 04:18:38.717603 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:38.717500 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea88884d-09d2-4193-8055-969c4e9441d1" containerName="copy" Apr 21 04:18:38.717603 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:38.717506 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea88884d-09d2-4193-8055-969c4e9441d1" containerName="copy" Apr 21 04:18:38.717603 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:38.717560 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="ea88884d-09d2-4193-8055-969c4e9441d1" containerName="gather" Apr 21 04:18:38.717603 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:38.717571 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="ea88884d-09d2-4193-8055-969c4e9441d1" containerName="copy" Apr 21 04:18:38.721468 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:38.721447 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-m5ztp" Apr 21 04:18:38.723523 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:38.723496 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dhdf9\"/\"openshift-service-ca.crt\"" Apr 21 04:18:38.724813 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:38.724786 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dhdf9\"/\"kube-root-ca.crt\"" Apr 21 04:18:38.724917 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:38.724815 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-dhdf9\"/\"default-dockercfg-h8wlt\"" Apr 21 04:18:38.726187 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:38.726165 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dhdf9/perf-node-gather-daemonset-m5ztp"] Apr 21 04:18:38.829405 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:38.829366 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/78371d12-0aee-492d-80f1-15eb881a10cc-lib-modules\") pod \"perf-node-gather-daemonset-m5ztp\" (UID: \"78371d12-0aee-492d-80f1-15eb881a10cc\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-m5ztp" Apr 21 04:18:38.829405 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:38.829403 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/78371d12-0aee-492d-80f1-15eb881a10cc-sys\") pod \"perf-node-gather-daemonset-m5ztp\" (UID: \"78371d12-0aee-492d-80f1-15eb881a10cc\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-m5ztp" Apr 21 04:18:38.829635 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:38.829426 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/78371d12-0aee-492d-80f1-15eb881a10cc-proc\") pod \"perf-node-gather-daemonset-m5ztp\" (UID: \"78371d12-0aee-492d-80f1-15eb881a10cc\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-m5ztp" Apr 21 04:18:38.829635 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:38.829495 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8tgz\" (UniqueName: \"kubernetes.io/projected/78371d12-0aee-492d-80f1-15eb881a10cc-kube-api-access-v8tgz\") pod \"perf-node-gather-daemonset-m5ztp\" (UID: \"78371d12-0aee-492d-80f1-15eb881a10cc\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-m5ztp" Apr 21 04:18:38.829635 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:38.829525 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/78371d12-0aee-492d-80f1-15eb881a10cc-podres\") pod \"perf-node-gather-daemonset-m5ztp\" (UID: \"78371d12-0aee-492d-80f1-15eb881a10cc\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-m5ztp" Apr 21 04:18:38.930512 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:38.930479 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/78371d12-0aee-492d-80f1-15eb881a10cc-proc\") pod \"perf-node-gather-daemonset-m5ztp\" (UID: \"78371d12-0aee-492d-80f1-15eb881a10cc\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-m5ztp" Apr 21 04:18:38.930711 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:38.930538 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8tgz\" (UniqueName: \"kubernetes.io/projected/78371d12-0aee-492d-80f1-15eb881a10cc-kube-api-access-v8tgz\") pod \"perf-node-gather-daemonset-m5ztp\" (UID: \"78371d12-0aee-492d-80f1-15eb881a10cc\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-m5ztp" Apr 21 04:18:38.930711 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:38.930568 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/78371d12-0aee-492d-80f1-15eb881a10cc-podres\") pod \"perf-node-gather-daemonset-m5ztp\" (UID: \"78371d12-0aee-492d-80f1-15eb881a10cc\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-m5ztp" Apr 21 04:18:38.930711 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:38.930594 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/78371d12-0aee-492d-80f1-15eb881a10cc-proc\") pod \"perf-node-gather-daemonset-m5ztp\" (UID: \"78371d12-0aee-492d-80f1-15eb881a10cc\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-m5ztp" Apr 21 04:18:38.930711 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:38.930599 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/78371d12-0aee-492d-80f1-15eb881a10cc-lib-modules\") pod \"perf-node-gather-daemonset-m5ztp\" (UID: \"78371d12-0aee-492d-80f1-15eb881a10cc\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-m5ztp" Apr 21 04:18:38.930711 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:38.930642 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/78371d12-0aee-492d-80f1-15eb881a10cc-sys\") pod \"perf-node-gather-daemonset-m5ztp\" (UID: 
\"78371d12-0aee-492d-80f1-15eb881a10cc\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-m5ztp" Apr 21 04:18:38.930711 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:38.930695 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/78371d12-0aee-492d-80f1-15eb881a10cc-sys\") pod \"perf-node-gather-daemonset-m5ztp\" (UID: \"78371d12-0aee-492d-80f1-15eb881a10cc\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-m5ztp" Apr 21 04:18:38.930711 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:38.930707 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/78371d12-0aee-492d-80f1-15eb881a10cc-lib-modules\") pod \"perf-node-gather-daemonset-m5ztp\" (UID: \"78371d12-0aee-492d-80f1-15eb881a10cc\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-m5ztp" Apr 21 04:18:38.930956 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:38.930737 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/78371d12-0aee-492d-80f1-15eb881a10cc-podres\") pod \"perf-node-gather-daemonset-m5ztp\" (UID: \"78371d12-0aee-492d-80f1-15eb881a10cc\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-m5ztp" Apr 21 04:18:38.938872 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:38.938851 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8tgz\" (UniqueName: \"kubernetes.io/projected/78371d12-0aee-492d-80f1-15eb881a10cc-kube-api-access-v8tgz\") pod \"perf-node-gather-daemonset-m5ztp\" (UID: \"78371d12-0aee-492d-80f1-15eb881a10cc\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-m5ztp" Apr 21 04:18:39.032365 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:39.032274 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-m5ztp" Apr 21 04:18:39.148089 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:39.148054 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dhdf9/perf-node-gather-daemonset-m5ztp"] Apr 21 04:18:39.151098 ip-10-0-131-93 kubenswrapper[2570]: W0421 04:18:39.151067 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod78371d12_0aee_492d_80f1_15eb881a10cc.slice/crio-149d4f0670b42e38d20d097e0631e9cbdf225d8e8005936098d3b2c5b3804b47 WatchSource:0}: Error finding container 149d4f0670b42e38d20d097e0631e9cbdf225d8e8005936098d3b2c5b3804b47: Status 404 returned error can't find the container with id 149d4f0670b42e38d20d097e0631e9cbdf225d8e8005936098d3b2c5b3804b47 Apr 21 04:18:39.390231 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:39.390204 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-879kk_6fd9ab38-f364-44df-8d61-4e3ba0946953/dns/0.log" Apr 21 04:18:39.410562 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:39.410540 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-879kk_6fd9ab38-f364-44df-8d61-4e3ba0946953/kube-rbac-proxy/0.log" Apr 21 04:18:39.516793 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:39.516763 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-26mzt_8a8c6dad-8135-4d60-b437-56307544e064/dns-node-resolver/0.log" Apr 21 04:18:39.928498 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:39.928464 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-m5ztp" event={"ID":"78371d12-0aee-492d-80f1-15eb881a10cc","Type":"ContainerStarted","Data":"52a3d1a7226fb4db4ce8f9da75f6231828f29a81a628a27f336aab0395cf0d16"} Apr 21 04:18:39.928498 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:39.928502 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-m5ztp" event={"ID":"78371d12-0aee-492d-80f1-15eb881a10cc","Type":"ContainerStarted","Data":"149d4f0670b42e38d20d097e0631e9cbdf225d8e8005936098d3b2c5b3804b47"} Apr 21 04:18:39.928987 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:39.928589 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-m5ztp" Apr 21 04:18:39.943659 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:39.943610 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-m5ztp" podStartSLOduration=1.94359561 podStartE2EDuration="1.94359561s" podCreationTimestamp="2026-04-21 04:18:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:18:39.942707955 +0000 UTC m=+1283.370479811" watchObservedRunningTime="2026-04-21 04:18:39.94359561 +0000 UTC m=+1283.371367485" Apr 21 04:18:40.063545 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:40.063519 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jw8wp_2b3ca347-33c6-4fbc-bc3d-a93f9f0c27fa/node-ca/0.log" Apr 21 04:18:41.299714 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:41.299686 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-lcnh4_e78af541-e68e-436e-8f47-28080dca3c2d/serve-healthcheck-canary/0.log" Apr 21 04:18:41.818211 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:41.818183 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dhkjb_a6921503-4d28-40e8-ad65-db03edc30976/kube-rbac-proxy/0.log" Apr 21 04:18:41.838094 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:41.838067 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dhkjb_a6921503-4d28-40e8-ad65-db03edc30976/exporter/0.log" Apr 21 04:18:41.859395 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:41.859371 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dhkjb_a6921503-4d28-40e8-ad65-db03edc30976/extractor/0.log" Apr 21 04:18:43.984374 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:43.984333 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-57f75ff788-kgttj_4657d863-7957-4d85-b78a-ac3eb614e0bc/manager/0.log" Apr 21 04:18:44.031160 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:44.031131 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-5gt8z_47107d89-d540-48bc-8740-1672744e8434/openshift-lws-operator/0.log" Apr 21 04:18:45.942209 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:45.942180 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-m5ztp" Apr 21 04:18:49.391407 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:49.391379 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xddzt_d9c4a2fd-4534-495a-8f40-6d8faf8f87e6/kube-multus-additional-cni-plugins/0.log" Apr 21 04:18:49.413882 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:49.413859 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xddzt_d9c4a2fd-4534-495a-8f40-6d8faf8f87e6/egress-router-binary-copy/0.log" Apr 21 04:18:49.436433 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:49.436403 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xddzt_d9c4a2fd-4534-495a-8f40-6d8faf8f87e6/cni-plugins/0.log" Apr 21 04:18:49.457695 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:49.457672 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xddzt_d9c4a2fd-4534-495a-8f40-6d8faf8f87e6/bond-cni-plugin/0.log" Apr 21 04:18:49.484314 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:49.484281 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xddzt_d9c4a2fd-4534-495a-8f40-6d8faf8f87e6/routeoverride-cni/0.log" Apr 21 04:18:49.504241 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:49.504219 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xddzt_d9c4a2fd-4534-495a-8f40-6d8faf8f87e6/whereabouts-cni-bincopy/0.log" Apr 21 04:18:49.524472 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:49.524441 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xddzt_d9c4a2fd-4534-495a-8f40-6d8faf8f87e6/whereabouts-cni/0.log" Apr 21 04:18:49.609612 ip-10-0-131-93 kubenswrapper[2570]: I0421 
04:18:49.609589 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zrwkk_114ff0f2-95bc-49bb-be65-079af4d8294d/kube-multus/0.log" Apr 21 04:18:49.719045 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:49.719010 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lf2dl_0642b1aa-ff76-4694-bad0-be2656b81005/network-metrics-daemon/0.log" Apr 21 04:18:49.743233 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:49.743207 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lf2dl_0642b1aa-ff76-4694-bad0-be2656b81005/kube-rbac-proxy/0.log" Apr 21 04:18:50.562233 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:50.562208 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6p6fc_3a284a31-b4ea-4280-a6d1-b84390d1488d/ovn-controller/0.log" Apr 21 04:18:50.582171 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:50.582145 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6p6fc_3a284a31-b4ea-4280-a6d1-b84390d1488d/ovn-acl-logging/0.log" Apr 21 04:18:50.587728 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:50.587707 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6p6fc_3a284a31-b4ea-4280-a6d1-b84390d1488d/ovn-acl-logging/1.log" Apr 21 04:18:50.606993 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:50.606964 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6p6fc_3a284a31-b4ea-4280-a6d1-b84390d1488d/kube-rbac-proxy-node/0.log" Apr 21 04:18:50.629876 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:50.629847 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6p6fc_3a284a31-b4ea-4280-a6d1-b84390d1488d/kube-rbac-proxy-ovn-metrics/0.log" Apr 21 04:18:50.647522 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:50.647497 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6p6fc_3a284a31-b4ea-4280-a6d1-b84390d1488d/northd/0.log" Apr 21 04:18:50.670549 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:50.670524 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6p6fc_3a284a31-b4ea-4280-a6d1-b84390d1488d/nbdb/0.log" Apr 21 04:18:50.688014 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:50.687991 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6p6fc_3a284a31-b4ea-4280-a6d1-b84390d1488d/sbdb/0.log" Apr 21 04:18:50.774908 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:50.774878 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6p6fc_3a284a31-b4ea-4280-a6d1-b84390d1488d/ovnkube-controller/0.log" Apr 21 04:18:52.548965 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:52.548930 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-ztjfh_be04fd1e-83bf-49d7-8c60-4323b986ab81/network-check-target-container/0.log" Apr 21 04:18:53.511277 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:53.511249 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-26xqh_6a2e2e97-0696-4765-bf51-be31ec3c66ba/iptables-alerter/0.log" Apr 21 04:18:54.239148 ip-10-0-131-93 kubenswrapper[2570]: I0421 04:18:54.239121 2570 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-65jhd_26270642-aaa5-4b43-804b-56317d766266/tuned/0.log"