Apr 23 16:32:10.718731 ip-10-0-142-4 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 23 16:32:10.718747 ip-10-0-142-4 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 23 16:32:10.718758 ip-10-0-142-4 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 23 16:32:10.719095 ip-10-0-142-4 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 23 16:32:20.911073 ip-10-0-142-4 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 23 16:32:20.911093 ip-10-0-142-4 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 2b83ea6f9f164271897b5db5cd2716ec --
Apr 23 16:34:49.625903 ip-10-0-142-4 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 16:34:50.029635 ip-10-0-142-4 kubenswrapper[2580]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:34:50.029635 ip-10-0-142-4 kubenswrapper[2580]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 16:34:50.029635 ip-10-0-142-4 kubenswrapper[2580]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:34:50.029635 ip-10-0-142-4 kubenswrapper[2580]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 16:34:50.029635 ip-10-0-142-4 kubenswrapper[2580]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:34:50.031091 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.030997 2580 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 16:34:50.036740 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036711 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:34:50.036740 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036735 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:34:50.036740 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036739 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:34:50.036740 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036742 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:34:50.036740 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036747 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:34:50.036740 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036750 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:34:50.037029 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036753 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:34:50.037029 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036756 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:34:50.037029 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036761 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:34:50.037029 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036765 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:34:50.037029 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036768 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:34:50.037029 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036770 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:34:50.037029 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036773 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:34:50.037029 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036776 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:34:50.037029 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036779 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:34:50.037029 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036781 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:34:50.037029 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036783 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:34:50.037029 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036786 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:34:50.037029 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036789 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:34:50.037029 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036791 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:34:50.037029 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036794 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:34:50.037029 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036798 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:34:50.037029 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036802 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:34:50.037029 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036805 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:34:50.037029 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036808 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:34:50.037479 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036811 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:34:50.037479 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036814 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:34:50.037479 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036817 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:34:50.037479 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036820 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:34:50.037479 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036823 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:34:50.037479 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036826 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:34:50.037479 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036828 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:34:50.037479 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036831 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:34:50.037479 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036834 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:34:50.037479 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036836 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:34:50.037479 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036839 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:34:50.037479 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036841 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:34:50.037479 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036843 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:34:50.037479 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036846 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:34:50.037479 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036849 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:34:50.037479 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036853 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:34:50.037479 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036856 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:34:50.037479 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036858 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:34:50.037479 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036861 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:34:50.037479 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036865 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:34:50.038060 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036867 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:34:50.038060 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036870 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:34:50.038060 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036873 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:34:50.038060 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036876 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:34:50.038060 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036879 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:34:50.038060 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036892 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:34:50.038060 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036896 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:34:50.038060 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036898 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:34:50.038060 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036901 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:34:50.038060 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036904 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:34:50.038060 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036907 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:34:50.038060 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036910 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:34:50.038060 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036912 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:34:50.038060 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036915 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:34:50.038060 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036917 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:34:50.038060 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036920 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:34:50.038060 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036922 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:34:50.038060 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036925 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:34:50.038060 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036927 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:34:50.038060 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036930 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:34:50.038536 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036932 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:34:50.038536 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036935 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:34:50.038536 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036937 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:34:50.038536 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036940 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:34:50.038536 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036943 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:34:50.038536 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036945 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:34:50.038536 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036949 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:34:50.038536 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036952 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:34:50.038536 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036955 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:34:50.038536 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036958 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:34:50.038536 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036960 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:34:50.038536 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036963 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:34:50.038536 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036966 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:34:50.038536 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036977 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:34:50.038536 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036980 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:34:50.038536 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036982 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:34:50.038536 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036985 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:34:50.038536 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036989 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:34:50.038536 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036992 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:34:50.038536 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036994 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:34:50.039033 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.036997 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:34:50.039033 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037479 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:34:50.039033 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037485 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:34:50.039033 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037488 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:34:50.039033 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037491 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:34:50.039033 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037494 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:34:50.039033 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037497 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:34:50.039033 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037500 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:34:50.039033 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037502 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:34:50.039033 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037505 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:34:50.039033 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037507 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:34:50.039033 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037510 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:34:50.039033 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037513 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:34:50.039033 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037517 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:34:50.039033 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037520 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:34:50.039033 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037522 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:34:50.039033 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037525 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:34:50.039033 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037527 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:34:50.039033 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037531 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:34:50.039486 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037534 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:34:50.039486 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037536 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:34:50.039486 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037539 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:34:50.039486 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037541 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:34:50.039486 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037544 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:34:50.039486 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037547 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:34:50.039486 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037555 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:34:50.039486 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037558 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:34:50.039486 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037561 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:34:50.039486 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037564 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:34:50.039486 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037566 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:34:50.039486 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037569 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:34:50.039486 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037571 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:34:50.039486 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037593 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:34:50.039486 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037597 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:34:50.039486 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037601 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:34:50.039486 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037604 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:34:50.039486 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037606 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:34:50.039486 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037609 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:34:50.039964 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037612 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:34:50.039964 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037614 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:34:50.039964 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037617 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:34:50.039964 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037619 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:34:50.039964 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037622 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:34:50.039964 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037624 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:34:50.039964 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037627 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:34:50.039964 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037629 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:34:50.039964 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037632 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:34:50.039964 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037634 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:34:50.039964 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037637 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:34:50.039964 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037639 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:34:50.039964 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037644 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:34:50.039964 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037648 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:34:50.039964 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037651 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:34:50.039964 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037655 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:34:50.039964 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037658 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:34:50.039964 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037661 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:34:50.039964 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037664 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:34:50.040434 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037666 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:34:50.040434 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037678 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:34:50.040434 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037681 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:34:50.040434 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037683 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:34:50.040434 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037686 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:34:50.040434 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037688 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:34:50.040434 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037691 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:34:50.040434 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037693 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:34:50.040434 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037696 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:34:50.040434 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037698 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:34:50.040434 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037701 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:34:50.040434 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037703 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:34:50.040434 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037706 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:34:50.040434 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037708 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:34:50.040434 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037710 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:34:50.040434 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037713 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:34:50.040434 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037715 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:34:50.040434 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037718 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:34:50.040434 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037720 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:34:50.040434 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037723 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:34:50.041067 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037725 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:34:50.041067 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037727 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:34:50.041067 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037730 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:34:50.041067 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037732 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:34:50.041067 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037735 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:34:50.041067 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037738 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:34:50.041067 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037741 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:34:50.041067 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037743 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:34:50.041067 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037746 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:34:50.041067 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.037748 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:34:50.041067 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037831 2580 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 16:34:50.041067 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037839 2580 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 16:34:50.041067 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037855 2580 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 16:34:50.041067 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037860 2580 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 16:34:50.041067 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037874 2580 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 16:34:50.041067 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037878 2580 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 16:34:50.041067 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037882 2580 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 16:34:50.041067 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037887 2580 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 16:34:50.041067 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037890 2580 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 16:34:50.041067 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037894 2580 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 16:34:50.041067 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037897 2580 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 16:34:50.041067 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037900 2580 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 16:34:50.041652 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037903 2580 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 16:34:50.041652 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037906 2580 flags.go:64] FLAG: --cgroup-root=""
Apr 23 16:34:50.041652 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037909 2580 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 16:34:50.041652 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037912 2580 flags.go:64] FLAG: --client-ca-file=""
Apr 23 16:34:50.041652 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037915 2580 flags.go:64] FLAG: --cloud-config=""
Apr 23 16:34:50.041652 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037918 2580 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 16:34:50.041652 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037920 2580 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 16:34:50.041652 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037929 2580 flags.go:64] FLAG: --cluster-domain=""
Apr 23 16:34:50.041652 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037932 2580 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 16:34:50.041652 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037935 2580 flags.go:64] FLAG: --config-dir=""
Apr 23 16:34:50.041652 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037937 2580 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 16:34:50.041652 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037941 2580 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 16:34:50.041652 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037945 2580 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 16:34:50.041652 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037948 2580 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 16:34:50.041652 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037951 2580 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 16:34:50.041652 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037955 2580 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 16:34:50.041652 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037959 2580 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 16:34:50.041652 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037962 2580 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 16:34:50.041652 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037965 2580 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 16:34:50.041652 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037968 2580 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 16:34:50.041652 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037971 2580 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 16:34:50.041652 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037975 2580 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 16:34:50.041652 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037978 2580 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 16:34:50.041652 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037981 2580 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 16:34:50.041652 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037983 2580 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 16:34:50.042247 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037992 2580 flags.go:64] FLAG: --enable-server="true"
Apr 23 16:34:50.042247 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.037995 2580 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 16:34:50.042247 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038002 2580 flags.go:64] FLAG: --event-burst="100"
Apr 23 16:34:50.042247 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038005 2580 flags.go:64] FLAG: --event-qps="50"
Apr 23 16:34:50.042247 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038014 2580 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 16:34:50.042247 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038017 2580 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 16:34:50.042247 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038020 2580 flags.go:64] FLAG: --eviction-hard=""
Apr 23 16:34:50.042247 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038024 2580 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 16:34:50.042247 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038027 2580 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 16:34:50.042247 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038030 2580 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 16:34:50.042247 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038033 2580 flags.go:64] FLAG: --eviction-soft=""
Apr 23 16:34:50.042247 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038036 2580 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 16:34:50.042247 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038039 2580 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 16:34:50.042247 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038042 2580 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 16:34:50.042247 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038045 2580 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 16:34:50.042247 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038048 2580 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 16:34:50.042247 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038051 2580 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 16:34:50.042247 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038054 2580 flags.go:64] FLAG: --feature-gates=""
Apr 23 16:34:50.042247 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038057 2580 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 16:34:50.042247 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038061 2580 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 16:34:50.042247 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038064 2580 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 23 16:34:50.042247 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038068 2580 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 23 16:34:50.042247 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038071 2580 flags.go:64] FLAG: --healthz-port="10248"
Apr 23 16:34:50.042247 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038074 2580 flags.go:64] FLAG:
--help="false" Apr 23 16:34:50.042247 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038077 2580 flags.go:64] FLAG: --hostname-override="ip-10-0-142-4.ec2.internal" Apr 23 16:34:50.042905 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038080 2580 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 16:34:50.042905 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038083 2580 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 16:34:50.042905 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038086 2580 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 16:34:50.042905 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038089 2580 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 16:34:50.042905 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038093 2580 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 16:34:50.042905 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038096 2580 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 16:34:50.042905 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038098 2580 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 16:34:50.042905 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038101 2580 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 16:34:50.042905 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038110 2580 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 16:34:50.042905 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038113 2580 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 16:34:50.042905 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038116 2580 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 16:34:50.042905 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038119 2580 flags.go:64] FLAG: --kube-reserved="" Apr 23 16:34:50.042905 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038122 
2580 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 16:34:50.042905 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038124 2580 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 16:34:50.042905 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038127 2580 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 16:34:50.042905 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038130 2580 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 16:34:50.042905 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038133 2580 flags.go:64] FLAG: --lock-file="" Apr 23 16:34:50.042905 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038136 2580 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 16:34:50.042905 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038138 2580 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 16:34:50.042905 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038141 2580 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 16:34:50.042905 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038147 2580 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 16:34:50.042905 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038149 2580 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 16:34:50.042905 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038152 2580 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 16:34:50.042905 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038155 2580 flags.go:64] FLAG: --logging-format="text" Apr 23 16:34:50.043462 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038158 2580 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 16:34:50.043462 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038161 2580 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 16:34:50.043462 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038164 2580 flags.go:64] FLAG: --manifest-url="" Apr 23 16:34:50.043462 ip-10-0-142-4 
kubenswrapper[2580]: I0423 16:34:50.038179 2580 flags.go:64] FLAG: --manifest-url-header="" Apr 23 16:34:50.043462 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038185 2580 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 16:34:50.043462 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038189 2580 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 16:34:50.043462 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038194 2580 flags.go:64] FLAG: --max-pods="110" Apr 23 16:34:50.043462 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038197 2580 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 16:34:50.043462 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038200 2580 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 16:34:50.043462 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038203 2580 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 16:34:50.043462 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038206 2580 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 16:34:50.043462 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038209 2580 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 16:34:50.043462 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038212 2580 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 16:34:50.043462 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038215 2580 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 16:34:50.043462 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038223 2580 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 16:34:50.043462 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038226 2580 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 16:34:50.043462 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038229 2580 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 16:34:50.043462 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038234 2580 flags.go:64] FLAG: 
--pod-cidr="" Apr 23 16:34:50.043462 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038237 2580 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 16:34:50.043462 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038242 2580 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 16:34:50.043462 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038245 2580 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 16:34:50.043462 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038249 2580 flags.go:64] FLAG: --pods-per-core="0" Apr 23 16:34:50.043462 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038252 2580 flags.go:64] FLAG: --port="10250" Apr 23 16:34:50.043462 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038255 2580 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 16:34:50.044047 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038258 2580 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-09ed95831d6100a96" Apr 23 16:34:50.044047 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038261 2580 flags.go:64] FLAG: --qos-reserved="" Apr 23 16:34:50.044047 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038264 2580 flags.go:64] FLAG: --read-only-port="10255" Apr 23 16:34:50.044047 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038267 2580 flags.go:64] FLAG: --register-node="true" Apr 23 16:34:50.044047 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038270 2580 flags.go:64] FLAG: --register-schedulable="true" Apr 23 16:34:50.044047 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038273 2580 flags.go:64] FLAG: --register-with-taints="" Apr 23 16:34:50.044047 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038277 2580 flags.go:64] FLAG: --registry-burst="10" Apr 23 16:34:50.044047 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038280 2580 flags.go:64] FLAG: --registry-qps="5" Apr 23 16:34:50.044047 ip-10-0-142-4 
kubenswrapper[2580]: I0423 16:34:50.038283 2580 flags.go:64] FLAG: --reserved-cpus="" Apr 23 16:34:50.044047 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038286 2580 flags.go:64] FLAG: --reserved-memory="" Apr 23 16:34:50.044047 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038290 2580 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 16:34:50.044047 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038293 2580 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 16:34:50.044047 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038296 2580 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 16:34:50.044047 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038300 2580 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 16:34:50.044047 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038303 2580 flags.go:64] FLAG: --runonce="false" Apr 23 16:34:50.044047 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038305 2580 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 16:34:50.044047 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038309 2580 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 16:34:50.044047 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038312 2580 flags.go:64] FLAG: --seccomp-default="false" Apr 23 16:34:50.044047 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038315 2580 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 16:34:50.044047 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038318 2580 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 16:34:50.044047 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038321 2580 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 16:34:50.044047 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038324 2580 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 16:34:50.044047 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038327 2580 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 
16:34:50.044047 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038329 2580 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 16:34:50.044047 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038332 2580 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 16:34:50.044047 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038335 2580 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 16:34:50.044717 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038339 2580 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 16:34:50.044717 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038342 2580 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 16:34:50.044717 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038344 2580 flags.go:64] FLAG: --system-cgroups="" Apr 23 16:34:50.044717 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038350 2580 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 16:34:50.044717 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038355 2580 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 16:34:50.044717 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038358 2580 flags.go:64] FLAG: --tls-cert-file="" Apr 23 16:34:50.044717 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038361 2580 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 16:34:50.044717 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038368 2580 flags.go:64] FLAG: --tls-min-version="" Apr 23 16:34:50.044717 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038371 2580 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 16:34:50.044717 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038374 2580 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 16:34:50.044717 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038377 2580 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 16:34:50.044717 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038380 2580 flags.go:64] FLAG: 
--topology-manager-scope="container" Apr 23 16:34:50.044717 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038383 2580 flags.go:64] FLAG: --v="2" Apr 23 16:34:50.044717 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038387 2580 flags.go:64] FLAG: --version="false" Apr 23 16:34:50.044717 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038391 2580 flags.go:64] FLAG: --vmodule="" Apr 23 16:34:50.044717 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038396 2580 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 16:34:50.044717 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038399 2580 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 16:34:50.044717 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038502 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 16:34:50.044717 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038506 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 16:34:50.044717 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038509 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 16:34:50.044717 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038512 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 16:34:50.044717 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038515 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 16:34:50.044717 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038518 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 16:34:50.044717 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038520 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 16:34:50.045302 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038523 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 16:34:50.045302 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038525 2580 
feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 16:34:50.045302 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038529 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 16:34:50.045302 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038533 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 16:34:50.045302 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038536 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 16:34:50.045302 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038539 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 16:34:50.045302 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038542 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 16:34:50.045302 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038546 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 16:34:50.045302 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038549 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 16:34:50.045302 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038552 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 16:34:50.045302 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038554 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 16:34:50.045302 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038558 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 16:34:50.045302 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038561 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 16:34:50.045302 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038563 2580 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesAWS Apr 23 16:34:50.045302 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038566 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 16:34:50.045302 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038568 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 16:34:50.045302 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038571 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 16:34:50.045302 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038591 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 16:34:50.045302 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038594 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 16:34:50.045804 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038596 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 16:34:50.045804 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038599 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 16:34:50.045804 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038601 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 16:34:50.045804 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038604 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 23 16:34:50.045804 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038607 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 16:34:50.045804 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038609 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 16:34:50.045804 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038612 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 16:34:50.045804 ip-10-0-142-4 
kubenswrapper[2580]: W0423 16:34:50.038614 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 16:34:50.045804 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038617 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 16:34:50.045804 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038619 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 16:34:50.045804 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038622 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 16:34:50.045804 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038625 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 16:34:50.045804 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038627 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 16:34:50.045804 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038630 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 16:34:50.045804 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038632 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 16:34:50.045804 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038635 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 16:34:50.045804 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038637 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 16:34:50.045804 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038640 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 16:34:50.045804 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038642 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 16:34:50.045804 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038645 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 
16:34:50.046319 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038647 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 16:34:50.046319 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038650 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 16:34:50.046319 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038652 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 16:34:50.046319 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038656 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 16:34:50.046319 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038659 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 16:34:50.046319 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038662 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 16:34:50.046319 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038665 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 16:34:50.046319 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038667 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 16:34:50.046319 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038670 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 16:34:50.046319 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038672 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 16:34:50.046319 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038677 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 16:34:50.046319 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038680 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 16:34:50.046319 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038683 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 16:34:50.046319 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038686 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 16:34:50.046319 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038688 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 16:34:50.046319 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038691 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 16:34:50.046319 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038693 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 16:34:50.046319 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038696 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 16:34:50.046319 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038698 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 16:34:50.046794 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038701 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 16:34:50.046794 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038703 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 16:34:50.046794 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038706 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 16:34:50.046794 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038709 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 16:34:50.046794 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038711 2580 feature_gate.go:328] unrecognized feature gate: 
AzureDedicatedHosts Apr 23 16:34:50.046794 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038714 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 16:34:50.046794 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038716 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 16:34:50.046794 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038719 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 16:34:50.046794 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038722 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 16:34:50.046794 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038724 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 16:34:50.046794 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038727 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 16:34:50.046794 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038729 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 16:34:50.046794 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038732 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 16:34:50.046794 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038734 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 16:34:50.046794 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038737 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 16:34:50.046794 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038739 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 16:34:50.046794 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038742 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 16:34:50.046794 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038746 2580 feature_gate.go:328] 
unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 16:34:50.046794 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038749 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 16:34:50.046794 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038752 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 16:34:50.047275 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.038754 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 16:34:50.047275 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.038760 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 16:34:50.047275 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.045689 2580 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 23 16:34:50.047275 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.045709 2580 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 23 16:34:50.047275 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045764 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 16:34:50.047275 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045770 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 16:34:50.047275 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045773 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 16:34:50.047275 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045776 2580 feature_gate.go:328] unrecognized feature 
gate: ImageStreamImportMode Apr 23 16:34:50.047275 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045779 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 16:34:50.047275 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045782 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 16:34:50.047275 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045786 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 16:34:50.047275 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045788 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 16:34:50.047275 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045791 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 16:34:50.047275 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045794 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 16:34:50.047275 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045796 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 16:34:50.047275 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045799 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 16:34:50.047687 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045802 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 16:34:50.047687 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045804 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 16:34:50.047687 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045807 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 16:34:50.047687 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045810 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 16:34:50.047687 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045812 2580 
feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 16:34:50.047687 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045816 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 16:34:50.047687 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045820 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 16:34:50.047687 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045823 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 16:34:50.047687 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045826 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 16:34:50.047687 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045829 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 16:34:50.047687 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045832 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 16:34:50.047687 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045836 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 16:34:50.047687 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045838 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 16:34:50.047687 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045843 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 16:34:50.047687 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045845 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 16:34:50.047687 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045848 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 16:34:50.047687 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045850 2580 feature_gate.go:328] unrecognized feature gate: 
InsightsConfigAPI Apr 23 16:34:50.047687 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045853 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 16:34:50.047687 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045855 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 16:34:50.048155 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045858 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 16:34:50.048155 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045861 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 16:34:50.048155 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045863 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 16:34:50.048155 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045866 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 16:34:50.048155 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045868 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 16:34:50.048155 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045871 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 16:34:50.048155 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045873 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 16:34:50.048155 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045876 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 16:34:50.048155 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045878 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 16:34:50.048155 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045880 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 16:34:50.048155 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045883 2580 feature_gate.go:328] 
unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 16:34:50.048155 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045886 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 16:34:50.048155 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045888 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 16:34:50.048155 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045891 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 16:34:50.048155 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045894 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 16:34:50.048155 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045896 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 16:34:50.048155 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045898 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 16:34:50.048155 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045901 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 16:34:50.048155 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045903 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 23 16:34:50.048155 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045906 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 16:34:50.048642 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045908 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 16:34:50.048642 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045911 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 16:34:50.048642 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045913 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 16:34:50.048642 ip-10-0-142-4 
kubenswrapper[2580]: W0423 16:34:50.045916 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 16:34:50.048642 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045918 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 16:34:50.048642 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045921 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 16:34:50.048642 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045926 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 16:34:50.048642 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045930 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 16:34:50.048642 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045933 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 16:34:50.048642 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045935 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 16:34:50.048642 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045938 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 16:34:50.048642 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045940 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 16:34:50.048642 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045943 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 16:34:50.048642 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045946 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 16:34:50.048642 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045948 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 16:34:50.048642 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045951 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 
16:34:50.048642 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045953 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 16:34:50.048642 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045956 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 16:34:50.048642 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045958 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 16:34:50.048642 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045961 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 16:34:50.049149 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045963 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 16:34:50.049149 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045965 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 16:34:50.049149 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045968 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 16:34:50.049149 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045971 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 16:34:50.049149 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045973 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 16:34:50.049149 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045976 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 16:34:50.049149 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045978 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 16:34:50.049149 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045981 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 16:34:50.049149 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045983 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 
23 16:34:50.049149 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045986 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 16:34:50.049149 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045989 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 16:34:50.049149 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045992 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 16:34:50.049149 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045994 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 16:34:50.049149 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045996 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 16:34:50.049149 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.045999 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 16:34:50.049149 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.046004 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 16:34:50.049595 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046116 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 16:34:50.049595 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046121 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 16:34:50.049595 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046125 2580 feature_gate.go:328] unrecognized feature gate: 
MachineAPIMigration Apr 23 16:34:50.049595 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046128 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 16:34:50.049595 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046132 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 16:34:50.049595 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046135 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 16:34:50.049595 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046138 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 16:34:50.049595 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046141 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 16:34:50.049595 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046143 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 16:34:50.049595 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046146 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 16:34:50.049595 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046149 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 16:34:50.049595 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046151 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 16:34:50.049595 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046154 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 16:34:50.049595 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046156 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 16:34:50.049595 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046159 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 16:34:50.049595 ip-10-0-142-4 kubenswrapper[2580]: W0423 
16:34:50.046162 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 16:34:50.049595 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046164 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 16:34:50.049595 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046166 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 16:34:50.049595 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046169 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 16:34:50.049595 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046171 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 16:34:50.050145 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046174 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 16:34:50.050145 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046176 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 16:34:50.050145 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046179 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 16:34:50.050145 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046181 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 16:34:50.050145 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046184 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 23 16:34:50.050145 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046186 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 16:34:50.050145 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046189 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 16:34:50.050145 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046191 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 16:34:50.050145 
ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046193 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 16:34:50.050145 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046196 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 16:34:50.050145 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046199 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 16:34:50.050145 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046201 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 16:34:50.050145 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046204 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 16:34:50.050145 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046207 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 16:34:50.050145 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046211 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 16:34:50.050145 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046214 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 16:34:50.050145 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046217 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 16:34:50.050145 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046220 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 16:34:50.050145 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046223 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 16:34:50.050629 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046226 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 16:34:50.050629 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046228 2580 
feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 16:34:50.050629 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046230 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 16:34:50.050629 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046233 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 16:34:50.050629 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046236 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 16:34:50.050629 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046239 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 16:34:50.050629 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046241 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 16:34:50.050629 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046244 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 16:34:50.050629 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046246 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 16:34:50.050629 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046249 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 16:34:50.050629 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046251 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 16:34:50.050629 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046254 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 16:34:50.050629 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046257 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 16:34:50.050629 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046259 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 16:34:50.050629 ip-10-0-142-4 kubenswrapper[2580]: W0423 
16:34:50.046261 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 16:34:50.050629 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046264 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 16:34:50.050629 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046267 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 16:34:50.050629 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046269 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 16:34:50.050629 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046271 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 16:34:50.050629 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046274 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 16:34:50.051121 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046276 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 16:34:50.051121 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046279 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 16:34:50.051121 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046281 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 16:34:50.051121 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046283 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 16:34:50.051121 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046286 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 16:34:50.051121 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046288 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 16:34:50.051121 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046290 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 
16:34:50.051121 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046293 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 16:34:50.051121 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046296 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 16:34:50.051121 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046298 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 16:34:50.051121 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046301 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 16:34:50.051121 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046304 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 16:34:50.051121 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046306 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 16:34:50.051121 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046309 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 16:34:50.051121 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046314 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 16:34:50.051121 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046316 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 16:34:50.051121 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046318 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 16:34:50.051121 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046321 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 16:34:50.051121 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046324 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 16:34:50.051121 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046327 2580 
feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 16:34:50.051628 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046329 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 16:34:50.051628 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046332 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 16:34:50.051628 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046335 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 16:34:50.051628 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046337 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 16:34:50.051628 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046340 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 16:34:50.051628 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046342 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 16:34:50.051628 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:50.046345 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 16:34:50.051628 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.046350 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 16:34:50.051628 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.047000 2580 server.go:962] "Client rotation is on, will bootstrap in background" Apr 23 16:34:50.051628 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.049043 2580 bootstrap.go:101] "Use the bootstrap 
credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 23 16:34:50.051628 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.049893 2580 server.go:1019] "Starting client certificate rotation" Apr 23 16:34:50.051628 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.049995 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 16:34:50.051628 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.050043 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 16:34:50.073531 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.073504 2580 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 16:34:50.078678 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.076030 2580 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 16:34:50.093050 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.093021 2580 log.go:25] "Validated CRI v1 runtime API" Apr 23 16:34:50.098605 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.098563 2580 log.go:25] "Validated CRI v1 image API" Apr 23 16:34:50.099891 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.099873 2580 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 23 16:34:50.103736 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.103713 2580 fs.go:135] Filesystem UUIDs: map[31ba0b87-ec12-497c-879b-d8c1ab1c7199:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 d92fc784-1b64-42cc-b95b-bd4572a383ce:/dev/nvme0n1p4] Apr 23 16:34:50.103814 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.103736 2580 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} 
/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 23 16:34:50.109473 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.109352 2580 manager.go:217] Machine: {Timestamp:2026-04-23 16:34:50.107660741 +0000 UTC m=+0.370446666 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3109892 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2ef54f62e9d1c070f8fe36fb917e08 SystemUUID:ec2ef54f-62e9-d1c0-70f8-fe36fb917e08 BootID:2b83ea6f-9f16-4271-897b-5db5cd2716ec Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:9b:4c:40:ec:99 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:9b:4c:40:ec:99 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ce:11:05:c3:4b:00 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 
Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 23 16:34:50.109473 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.109467 2580 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 23 16:34:50.109597 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.109561 2580 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 16:34:50.110550 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.110522 2580 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 16:34:50.110716 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.110553 2580 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-4.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 16:34:50.110760 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.110725 2580 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 16:34:50.110760 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.110734 2580 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 16:34:50.110760 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.110748 2580 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 16:34:50.111427 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.111416 2580 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 16:34:50.112667 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.112656 2580 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 16:34:50.112794 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.112785 2580 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 16:34:50.115599 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.115587 2580 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 16:34:50.115646 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.115612 2580 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 16:34:50.115646 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.115625 2580 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 16:34:50.115646 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.115635 2580 kubelet.go:397] "Adding apiserver pod source"
Apr 23 16:34:50.115646 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.115645 2580 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 23 16:34:50.116784 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.116772 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 16:34:50.116820 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.116791 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 16:34:50.120638 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.120612 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 16:34:50.120939 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.120920 2580 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 23 16:34:50.123089 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.123073 2580 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 23 16:34:50.124299 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.124287 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 23 16:34:50.124359 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.124308 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 23 16:34:50.124359 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.124315 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 23 16:34:50.124359 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.124320 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 23 16:34:50.124359 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.124326 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 23 16:34:50.124359 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.124332 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 23 16:34:50.124359 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.124338 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 23 16:34:50.124359 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.124344 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 23 16:34:50.124359 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.124351 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 23 16:34:50.124359 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.124357 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 23 16:34:50.124617 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.124366 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 23 16:34:50.124617 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.124374 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 23 16:34:50.125043 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.125034 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 23 16:34:50.125043 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.125044 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 23 16:34:50.129232 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.129218 2580 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 23 16:34:50.129288 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.129267 2580 server.go:1295] "Started kubelet"
Apr 23 16:34:50.129373 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.129343 2580 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 23 16:34:50.129423 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.129378 2580 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 23 16:34:50.129459 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.129449 2580 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 23 16:34:50.130376 ip-10-0-142-4 systemd[1]: Started Kubernetes Kubelet.
Apr 23 16:34:50.131015 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.130819 2580 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 16:34:50.131318 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:50.131289 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 16:34:50.131318 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:50.131301 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-4.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 16:34:50.131422 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.131362 2580 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-4.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 16:34:50.131612 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.131600 2580 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 16:34:50.136268 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.136245 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 16:34:50.136750 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.136728 2580 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 16:34:50.137439 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.137415 2580 factory.go:55] Registering systemd factory
Apr 23 16:34:50.137533 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.137508 2580 factory.go:223] Registration of the systemd container factory successfully
Apr 23 16:34:50.137719 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:50.137703 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-4.ec2.internal\" not found"
Apr 23 16:34:50.137776 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.137750 2580 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 16:34:50.137776 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.137760 2580 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 16:34:50.137852 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.137810 2580 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 16:34:50.137890 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.137868 2580 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 16:34:50.137931 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.137902 2580 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 16:34:50.137989 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.137975 2580 factory.go:153] Registering CRI-O factory
Apr 23 16:34:50.138027 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.137994 2580 factory.go:223] Registration of the crio container factory successfully
Apr 23 16:34:50.138070 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.138048 2580 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 16:34:50.138113 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.138076 2580 factory.go:103] Registering Raw factory
Apr 23 16:34:50.138113 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.138092 2580 manager.go:1196] Started watching for new ooms in manager
Apr 23 16:34:50.138591 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.138526 2580 manager.go:319] Starting recovery of all containers
Apr 23 16:34:50.139752 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:50.139725 2580 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 16:34:50.148446 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:50.148229 2580 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-142-4.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 23 16:34:50.148692 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:50.148668 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 23 16:34:50.151524 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.151504 2580 manager.go:324] Recovery completed
Apr 23 16:34:50.153121 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:50.148839 2580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-4.ec2.internal.18a9099b317d1663 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-4.ec2.internal,UID:ip-10-0-142-4.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-142-4.ec2.internal,},FirstTimestamp:2026-04-23 16:34:50.129233507 +0000 UTC m=+0.392019433,LastTimestamp:2026-04-23 16:34:50.129233507 +0000 UTC m=+0.392019433,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-4.ec2.internal,}"
Apr 23 16:34:50.154499 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:50.154470 2580 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 23 16:34:50.157640 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.157625 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:34:50.162042 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.162025 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-4.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:34:50.162117 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.162055 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-4.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:34:50.162117 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.162067 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-4.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:34:50.162843 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.162827 2580 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 16:34:50.162843 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.162841 2580 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 16:34:50.162958 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.162860 2580 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 16:34:50.165112 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.165098 2580 policy_none.go:49] "None policy: Start"
Apr 23 16:34:50.165112 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.165115 2580 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 16:34:50.165211 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.165127 2580 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 16:34:50.167692 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:50.167617 2580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-4.ec2.internal.18a9099b3371b2c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-4.ec2.internal,UID:ip-10-0-142-4.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-142-4.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-142-4.ec2.internal,},FirstTimestamp:2026-04-23 16:34:50.162041538 +0000 UTC m=+0.424827462,LastTimestamp:2026-04-23 16:34:50.162041538 +0000 UTC m=+0.424827462,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-4.ec2.internal,}"
Apr 23 16:34:50.183708 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:50.183626 2580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-4.ec2.internal.18a9099b3371fecc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-4.ec2.internal,UID:ip-10-0-142-4.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-142-4.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-142-4.ec2.internal,},FirstTimestamp:2026-04-23 16:34:50.162061004 +0000 UTC m=+0.424846928,LastTimestamp:2026-04-23 16:34:50.162061004 +0000 UTC m=+0.424846928,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-4.ec2.internal,}"
Apr 23 16:34:50.193455 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.193433 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lgdr9"
Apr 23 16:34:50.226863 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:50.201400 2580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-4.ec2.internal.18a9099b33722468 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-4.ec2.internal,UID:ip-10-0-142-4.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-142-4.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-142-4.ec2.internal,},FirstTimestamp:2026-04-23 16:34:50.162070632 +0000 UTC m=+0.424856555,LastTimestamp:2026-04-23 16:34:50.162070632 +0000 UTC m=+0.424856555,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-4.ec2.internal,}"
Apr 23 16:34:50.226863 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.213589 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lgdr9"
Apr 23 16:34:50.238016 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:50.237979 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-4.ec2.internal\" not found"
Apr 23 16:34:50.254524 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.254503 2580 manager.go:341] "Starting Device Plugin manager"
Apr 23 16:34:50.254717 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:50.254565 2580 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 16:34:50.254717 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.254597 2580 server.go:85] "Starting device plugin registration server"
Apr 23 16:34:50.254941 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.254925 2580 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 16:34:50.254989 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.254942 2580 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 16:34:50.255101 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.255066 2580 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 16:34:50.255232 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.255176 2580 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 16:34:50.255232 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.255186 2580 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 16:34:50.255681 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:50.255663 2580 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 16:34:50.255791 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:50.255708 2580 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-4.ec2.internal\" not found"
Apr 23 16:34:50.275421 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.275385 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 16:34:50.277019 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.276992 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 16:34:50.277336 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.277025 2580 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 16:34:50.277336 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.277054 2580 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 16:34:50.277336 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.277065 2580 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 16:34:50.277336 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:50.277109 2580 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 16:34:50.279848 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.279779 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:34:50.354918 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:50.354880 2580 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-142-4.ec2.internal\" not found" node="ip-10-0-142-4.ec2.internal"
Apr 23 16:34:50.355886 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.355872 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:34:50.357127 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.357110 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-4.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:34:50.357239 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.357143 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-4.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:34:50.357239 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.357154 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-4.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:34:50.357239 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.357183 2580 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-4.ec2.internal"
Apr 23 16:34:50.365680 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.365652 2580 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-4.ec2.internal"
Apr 23 16:34:50.365777 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:50.365685 2580 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-4.ec2.internal\": node \"ip-10-0-142-4.ec2.internal\" not found"
Apr 23 16:34:50.377485 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.377444 2580 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-4.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-142-4.ec2.internal"]
Apr 23 16:34:50.377659 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.377540 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:34:50.378484 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.378467 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-4.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:34:50.378592 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.378501 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-4.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:34:50.378592 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.378517 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-4.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:34:50.380901 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.380873 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:34:50.381003 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.380986 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-4.ec2.internal"
Apr 23 16:34:50.381053 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.381039 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:34:50.381674 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.381660 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-4.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:34:50.381731 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.381684 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-4.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:34:50.381731 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.381694 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-4.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:34:50.381802 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.381660 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-4.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:34:50.381802 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.381768 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-4.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:34:50.381802 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.381777 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-4.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:34:50.383821 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.383801 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-4.ec2.internal"
Apr 23 16:34:50.383921 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.383828 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:34:50.384507 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.384486 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-4.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:34:50.384607 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.384515 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-4.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:34:50.384607 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.384525 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-4.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:34:50.398461 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:50.398434 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-4.ec2.internal\" not found"
Apr 23 16:34:50.408446 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:50.408417 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-4.ec2.internal\" not found" node="ip-10-0-142-4.ec2.internal"
Apr 23 16:34:50.413116 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:50.413097 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-4.ec2.internal\" not found" node="ip-10-0-142-4.ec2.internal"
Apr 23 16:34:50.439252 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.439217 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6ca4686b7542221f742a8d14ace1d047-config\") pod \"kube-apiserver-proxy-ip-10-0-142-4.ec2.internal\" (UID: \"6ca4686b7542221f742a8d14ace1d047\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-4.ec2.internal"
Apr 23 16:34:50.439415 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.439262 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/96c9e4ad769a0a0462456abe60e537d4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-4.ec2.internal\" (UID: \"96c9e4ad769a0a0462456abe60e537d4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-4.ec2.internal"
Apr 23 16:34:50.439415 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.439282 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/96c9e4ad769a0a0462456abe60e537d4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-4.ec2.internal\" (UID: \"96c9e4ad769a0a0462456abe60e537d4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-4.ec2.internal"
Apr 23 16:34:50.499630 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:50.499571 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-4.ec2.internal\" not found"
Apr 23 16:34:50.540458 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.540361 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6ca4686b7542221f742a8d14ace1d047-config\") pod \"kube-apiserver-proxy-ip-10-0-142-4.ec2.internal\" (UID: \"6ca4686b7542221f742a8d14ace1d047\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-4.ec2.internal"
Apr 23 16:34:50.540458 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.540396 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/96c9e4ad769a0a0462456abe60e537d4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-4.ec2.internal\" (UID: \"96c9e4ad769a0a0462456abe60e537d4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-4.ec2.internal"
Apr 23 16:34:50.540458 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.540416 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/96c9e4ad769a0a0462456abe60e537d4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-4.ec2.internal\" (UID: \"96c9e4ad769a0a0462456abe60e537d4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-4.ec2.internal"
Apr 23 16:34:50.540458 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.540457 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/96c9e4ad769a0a0462456abe60e537d4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-4.ec2.internal\" (UID: \"96c9e4ad769a0a0462456abe60e537d4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-4.ec2.internal"
Apr 23 16:34:50.540756 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.540465 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6ca4686b7542221f742a8d14ace1d047-config\") pod \"kube-apiserver-proxy-ip-10-0-142-4.ec2.internal\" (UID: \"6ca4686b7542221f742a8d14ace1d047\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-4.ec2.internal"
Apr 23 16:34:50.540756 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.540492 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/96c9e4ad769a0a0462456abe60e537d4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-4.ec2.internal\" (UID: \"96c9e4ad769a0a0462456abe60e537d4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-4.ec2.internal"
Apr 23 16:34:50.600682 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:50.600652 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-4.ec2.internal\" not found"
Apr 23 16:34:50.701342 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:50.701297 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-4.ec2.internal\" not found"
Apr 23 16:34:50.710506 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.710479 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-4.ec2.internal"
Apr 23 16:34:50.715010 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:50.714987 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-4.ec2.internal"
Apr 23 16:34:50.801971 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:50.801871 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-4.ec2.internal\" not found"
Apr 23 16:34:50.902406 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:50.902371 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-4.ec2.internal\" not found"
Apr 23 16:34:51.002978 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:51.002943 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-4.ec2.internal\" not found"
Apr 23 16:34:51.050690 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:51.050662 2580 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 16:34:51.051376 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:51.050816 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 16:34:51.104073 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:51.103987 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-4.ec2.internal\" not found"
Apr 23 16:34:51.136972 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:51.136943 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 16:34:51.156881 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:51.156839 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 16:34:51.161988 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:51.161968 2580 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:34:51.186992 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:51.186956 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-gffzz"
Apr 23 16:34:51.198511 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:51.198483 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-gffzz"
Apr 23 16:34:51.204919 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:51.204897 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-4.ec2.internal\" not found"
Apr 23 16:34:51.215060 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:51.215030 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 16:29:50 +0000 UTC" deadline="2027-10-20 16:27:36.902761703 +0000 UTC"
Apr 23 16:34:51.215129 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:51.215065 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13079h52m45.687705092s"
Apr 23 16:34:51.256192 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:51.256167 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:34:51.270791 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:51.270759 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ca4686b7542221f742a8d14ace1d047.slice/crio-5d42a577628371afc58184367e4e8e6c89c93925467849c01eb6e8d2022e026d WatchSource:0}: Error finding container 5d42a577628371afc58184367e4e8e6c89c93925467849c01eb6e8d2022e026d: Status 404 returned error can't find the container with id 
5d42a577628371afc58184367e4e8e6c89c93925467849c01eb6e8d2022e026d Apr 23 16:34:51.270971 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:51.270960 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96c9e4ad769a0a0462456abe60e537d4.slice/crio-5fbdaf38f616389d150efc9bbb141ffb5da71088232d87a8ad131b44291bcb79 WatchSource:0}: Error finding container 5fbdaf38f616389d150efc9bbb141ffb5da71088232d87a8ad131b44291bcb79: Status 404 returned error can't find the container with id 5fbdaf38f616389d150efc9bbb141ffb5da71088232d87a8ad131b44291bcb79 Apr 23 16:34:51.275928 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:51.275885 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 16:34:51.293547 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:51.293495 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-4.ec2.internal" event={"ID":"6ca4686b7542221f742a8d14ace1d047","Type":"ContainerStarted","Data":"5d42a577628371afc58184367e4e8e6c89c93925467849c01eb6e8d2022e026d"} Apr 23 16:34:51.294450 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:51.294430 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-4.ec2.internal" event={"ID":"96c9e4ad769a0a0462456abe60e537d4","Type":"ContainerStarted","Data":"5fbdaf38f616389d150efc9bbb141ffb5da71088232d87a8ad131b44291bcb79"} Apr 23 16:34:51.305622 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:51.305599 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-4.ec2.internal\" not found" Apr 23 16:34:51.406263 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:51.406161 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-4.ec2.internal\" not found" Apr 23 16:34:51.506786 ip-10-0-142-4 kubenswrapper[2580]: 
E0423 16:34:51.506744 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-4.ec2.internal\" not found" Apr 23 16:34:51.607424 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:51.607385 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-4.ec2.internal\" not found" Apr 23 16:34:51.624327 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:51.624286 2580 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:34:51.638453 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:51.638416 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-4.ec2.internal" Apr 23 16:34:51.650353 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:51.650325 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 16:34:51.651157 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:51.651134 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-4.ec2.internal" Apr 23 16:34:51.663674 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:51.663595 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 16:34:51.985364 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:51.985320 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:34:52.116966 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.116932 2580 apiserver.go:52] "Watching apiserver" Apr 23 16:34:52.126193 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.126165 2580 reflector.go:430] "Caches populated" type="*v1.Pod" 
reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 16:34:52.126701 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.126668 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-4.ec2.internal","openshift-multus/multus-8rnmk","openshift-multus/multus-additional-cni-plugins-58q6m","openshift-multus/network-metrics-daemon-f889w","openshift-ovn-kubernetes/ovnkube-node-xvn7t","kube-system/konnectivity-agent-cdgwc","kube-system/kube-apiserver-proxy-ip-10-0-142-4.ec2.internal","openshift-cluster-node-tuning-operator/tuned-kwzbs","openshift-dns/node-resolver-78hbm","openshift-image-registry/node-ca-rshv8","openshift-network-diagnostics/network-check-target-7bn2z","openshift-network-operator/iptables-alerter-qg8fj"] Apr 23 16:34:52.130124 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.130101 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-cdgwc" Apr 23 16:34:52.132654 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.132624 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-7b98n\"" Apr 23 16:34:52.132762 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.132663 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 16:34:52.132762 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.132715 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 16:34:52.134742 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.134691 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.136956 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.136935 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 16:34:52.137062 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.136979 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f889w" Apr 23 16:34:52.137062 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:52.137046 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f889w" podUID="67b8cec4-f05e-4ef7-9456-915dfa5c7554" Apr 23 16:34:52.137171 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.136936 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-58q6m" Apr 23 16:34:52.137247 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.137228 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 16:34:52.137481 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.137463 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 16:34:52.138158 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.137728 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 16:34:52.138158 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.137778 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-tx5dc\"" Apr 23 16:34:52.139745 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.139665 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-qlnsr\"" Apr 23 16:34:52.139745 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.139738 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 16:34:52.140404 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.140387 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 16:34:52.141872 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.141849 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk" Apr 23 16:34:52.147732 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.144127 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 16:34:52.147732 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.144144 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 16:34:52.147732 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.144218 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 16:34:52.147732 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.144361 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" Apr 23 16:34:52.147732 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.144433 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.147732 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.145072 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-94hm9\"" Apr 23 16:34:52.147732 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.146621 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-nhpz7\"" Apr 23 16:34:52.147732 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.146823 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 16:34:52.147732 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.147042 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 16:34:52.148101 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.147822 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 16:34:52.148101 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.147867 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-78hbm" Apr 23 16:34:52.149008 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.148981 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-system-cni-dir\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.149111 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.149020 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-multus-cni-dir\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.149111 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.149048 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-multus-socket-dir-parent\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.149111 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.149070 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-run-systemd\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" Apr 23 16:34:52.149111 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.149095 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-run-ovn\") pod 
\"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" Apr 23 16:34:52.149337 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.149116 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-log-socket\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" Apr 23 16:34:52.149337 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.149140 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7f20dda8-0907-46fa-84c0-d1304b1105df-konnectivity-ca\") pod \"konnectivity-agent-cdgwc\" (UID: \"7f20dda8-0907-46fa-84c0-d1304b1105df\") " pod="kube-system/konnectivity-agent-cdgwc" Apr 23 16:34:52.149337 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.149161 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-cnibin\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.149337 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.149185 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-os-release\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.149337 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.149208 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-host-run-k8s-cni-cncf-io\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.149337 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.149233 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-host-var-lib-cni-multus\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.149337 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.149269 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fgf5\" (UniqueName: \"kubernetes.io/projected/76787c69-1999-41dd-9713-d68801605aa8-kube-api-access-4fgf5\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.149337 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.149291 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-host-kubelet\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" Apr 23 16:34:52.149756 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.149348 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-var-lib-openvswitch\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" Apr 23 16:34:52.149756 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.149376 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-node-log\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" Apr 23 16:34:52.149756 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.149402 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9696cbb9-a1db-4ead-914d-e2d11faa33b6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-58q6m\" (UID: \"9696cbb9-a1db-4ead-914d-e2d11faa33b6\") " pod="openshift-multus/multus-additional-cni-plugins-58q6m" Apr 23 16:34:52.149756 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.149434 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-multus-conf-dir\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.149756 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.149455 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/76787c69-1999-41dd-9713-d68801605aa8-multus-daemon-config\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.149756 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.149470 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9696cbb9-a1db-4ead-914d-e2d11faa33b6-cnibin\") pod \"multus-additional-cni-plugins-58q6m\" (UID: \"9696cbb9-a1db-4ead-914d-e2d11faa33b6\") " 
pod="openshift-multus/multus-additional-cni-plugins-58q6m" Apr 23 16:34:52.149756 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.149490 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-systemd-units\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" Apr 23 16:34:52.149756 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.149502 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 16:34:52.149756 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.149721 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 16:34:52.150239 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.149908 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 16:34:52.150239 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.150089 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 16:34:52.150239 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.149506 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-host-run-netns\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" Apr 23 16:34:52.150239 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.150207 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-host-run-ovn-kubernetes\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" Apr 23 16:34:52.150239 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.150240 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-hostroot\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.150507 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.150255 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-host-run-multus-certs\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.150507 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.150269 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67b8cec4-f05e-4ef7-9456-915dfa5c7554-metrics-certs\") pod \"network-metrics-daemon-f889w\" (UID: \"67b8cec4-f05e-4ef7-9456-915dfa5c7554\") " pod="openshift-multus/network-metrics-daemon-f889w" Apr 23 16:34:52.150507 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.150282 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 16:34:52.150507 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.150371 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 16:34:52.150507 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.150487 2580 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-tnk44\"" Apr 23 16:34:52.150790 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.150547 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-r7zt2\"" Apr 23 16:34:52.150790 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.150283 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/84c993c8-4dd2-40dc-b624-68a9f75a89cb-env-overrides\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" Apr 23 16:34:52.150790 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.150648 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 16:34:52.150790 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.150677 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/84c993c8-4dd2-40dc-b624-68a9f75a89cb-ovn-node-metrics-cert\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" Apr 23 16:34:52.150790 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.150704 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9696cbb9-a1db-4ead-914d-e2d11faa33b6-system-cni-dir\") pod \"multus-additional-cni-plugins-58q6m\" (UID: \"9696cbb9-a1db-4ead-914d-e2d11faa33b6\") " pod="openshift-multus/multus-additional-cni-plugins-58q6m" Apr 23 16:34:52.150790 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.150731 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"os-release\" (UniqueName: \"kubernetes.io/host-path/9696cbb9-a1db-4ead-914d-e2d11faa33b6-os-release\") pod \"multus-additional-cni-plugins-58q6m\" (UID: \"9696cbb9-a1db-4ead-914d-e2d11faa33b6\") " pod="openshift-multus/multus-additional-cni-plugins-58q6m"
Apr 23 16:34:52.150790 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.150768 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9696cbb9-a1db-4ead-914d-e2d11faa33b6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-58q6m\" (UID: \"9696cbb9-a1db-4ead-914d-e2d11faa33b6\") " pod="openshift-multus/multus-additional-cni-plugins-58q6m"
Apr 23 16:34:52.151126 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.150798 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-run-openvswitch\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.151126 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.150821 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-host-cni-netd\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.151126 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.150855 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrd55\" (UniqueName: \"kubernetes.io/projected/84c993c8-4dd2-40dc-b624-68a9f75a89cb-kube-api-access-wrd55\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.151126 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.150891 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djdqj\" (UniqueName: \"kubernetes.io/projected/9696cbb9-a1db-4ead-914d-e2d11faa33b6-kube-api-access-djdqj\") pod \"multus-additional-cni-plugins-58q6m\" (UID: \"9696cbb9-a1db-4ead-914d-e2d11faa33b6\") " pod="openshift-multus/multus-additional-cni-plugins-58q6m"
Apr 23 16:34:52.151126 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.150924 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-host-run-netns\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk"
Apr 23 16:34:52.151126 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.150949 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-host-var-lib-cni-bin\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk"
Apr 23 16:34:52.151126 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.150980 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-host-slash\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.151126 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.151005 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.151126 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.151030 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/84c993c8-4dd2-40dc-b624-68a9f75a89cb-ovnkube-config\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.151126 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.151053 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7f20dda8-0907-46fa-84c0-d1304b1105df-agent-certs\") pod \"konnectivity-agent-cdgwc\" (UID: \"7f20dda8-0907-46fa-84c0-d1304b1105df\") " pod="kube-system/konnectivity-agent-cdgwc"
Apr 23 16:34:52.151126 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.151104 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-host-var-lib-kubelet\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk"
Apr 23 16:34:52.151713 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.151141 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-etc-kubernetes\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk"
Apr 23 16:34:52.151713 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.151166 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9696cbb9-a1db-4ead-914d-e2d11faa33b6-cni-binary-copy\") pod \"multus-additional-cni-plugins-58q6m\" (UID: \"9696cbb9-a1db-4ead-914d-e2d11faa33b6\") " pod="openshift-multus/multus-additional-cni-plugins-58q6m"
Apr 23 16:34:52.151713 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.151191 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/84c993c8-4dd2-40dc-b624-68a9f75a89cb-ovnkube-script-lib\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.151713 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.151213 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/76787c69-1999-41dd-9713-d68801605aa8-cni-binary-copy\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk"
Apr 23 16:34:52.151713 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.151238 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9696cbb9-a1db-4ead-914d-e2d11faa33b6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-58q6m\" (UID: \"9696cbb9-a1db-4ead-914d-e2d11faa33b6\") " pod="openshift-multus/multus-additional-cni-plugins-58q6m"
Apr 23 16:34:52.151713 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.151266 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdqfl\" (UniqueName: \"kubernetes.io/projected/67b8cec4-f05e-4ef7-9456-915dfa5c7554-kube-api-access-fdqfl\") pod \"network-metrics-daemon-f889w\" (UID: \"67b8cec4-f05e-4ef7-9456-915dfa5c7554\") " pod="openshift-multus/network-metrics-daemon-f889w"
Apr 23 16:34:52.151713 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.151289 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-etc-openvswitch\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.151713 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.151311 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-host-cni-bin\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.151713 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.151511 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rshv8"
Apr 23 16:34:52.155628 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.155608 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 23 16:34:52.156118 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.156096 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 23 16:34:52.156207 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.156196 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-8jkvt\""
Apr 23 16:34:52.156442 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.156392 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 23 16:34:52.156623 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.156605 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7bn2z"
Apr 23 16:34:52.156716 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.156641 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qg8fj"
Apr 23 16:34:52.156716 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:52.156669 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7bn2z" podUID="38d83fc0-30d4-48d7-8aee-f7afaa404c2e"
Apr 23 16:34:52.159653 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.159632 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 23 16:34:52.159901 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.159879 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 23 16:34:52.160100 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.160086 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-9zqk6\""
Apr 23 16:34:52.160286 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.160272 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 23 16:34:52.199912 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.199873 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 16:29:51 +0000 UTC" deadline="2028-02-04 04:56:57.502693934 +0000 UTC"
Apr 23 16:34:52.199912 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.199909 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15636h22m5.302787267s"
Apr 23 16:34:52.239030 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.238959 2580 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 23 16:34:52.251590 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.251534 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/84c993c8-4dd2-40dc-b624-68a9f75a89cb-ovnkube-script-lib\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.251590 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.251594 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdqfl\" (UniqueName: \"kubernetes.io/projected/67b8cec4-f05e-4ef7-9456-915dfa5c7554-kube-api-access-fdqfl\") pod \"network-metrics-daemon-f889w\" (UID: \"67b8cec4-f05e-4ef7-9456-915dfa5c7554\") " pod="openshift-multus/network-metrics-daemon-f889w"
Apr 23 16:34:52.251835 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.251620 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28-sys-fs\") pod \"aws-ebs-csi-driver-node-rmhqk\" (UID: \"6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk"
Apr 23 16:34:52.251835 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.251636 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsxnm\" (UniqueName: \"kubernetes.io/projected/325c692e-07b3-4dcc-984b-733489080887-kube-api-access-vsxnm\") pod \"node-resolver-78hbm\" (UID: \"325c692e-07b3-4dcc-984b-733489080887\") " pod="openshift-dns/node-resolver-78hbm"
Apr 23 16:34:52.251835 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.251665 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-system-cni-dir\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk"
Apr 23 16:34:52.251835 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.251691 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-multus-socket-dir-parent\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk"
Apr 23 16:34:52.251835 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.251713 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-run-ovn\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.251835 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.251738 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7f20dda8-0907-46fa-84c0-d1304b1105df-konnectivity-ca\") pod \"konnectivity-agent-cdgwc\" (UID: \"7f20dda8-0907-46fa-84c0-d1304b1105df\") " pod="kube-system/konnectivity-agent-cdgwc"
Apr 23 16:34:52.251835 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.251762 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-etc-kubernetes\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs"
Apr 23 16:34:52.251835 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.251787 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-etc-systemd\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs"
Apr 23 16:34:52.251835 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.251783 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-multus-socket-dir-parent\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk"
Apr 23 16:34:52.251835 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.251816 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-var-lib-openvswitch\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.251835 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.251784 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-system-cni-dir\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk"
Apr 23 16:34:52.252153 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.251839 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-node-log\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.252153 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.251876 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-node-log\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.252153 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.251883 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-host-cni-netd\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.252153 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.251926 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-host-cni-netd\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.252153 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.251928 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-run-ovn\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.252153 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.251970 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-var-lib-openvswitch\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.252153 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.251995 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrd55\" (UniqueName: \"kubernetes.io/projected/84c993c8-4dd2-40dc-b624-68a9f75a89cb-kube-api-access-wrd55\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.252153 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252042 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28-socket-dir\") pod \"aws-ebs-csi-driver-node-rmhqk\" (UID: \"6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk"
Apr 23 16:34:52.252153 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252088 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9696cbb9-a1db-4ead-914d-e2d11faa33b6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-58q6m\" (UID: \"9696cbb9-a1db-4ead-914d-e2d11faa33b6\") " pod="openshift-multus/multus-additional-cni-plugins-58q6m"
Apr 23 16:34:52.252388 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252249 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/84c993c8-4dd2-40dc-b624-68a9f75a89cb-ovnkube-script-lib\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.252388 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252280 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/76787c69-1999-41dd-9713-d68801605aa8-multus-daemon-config\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk"
Apr 23 16:34:52.252388 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252313 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7f20dda8-0907-46fa-84c0-d1304b1105df-konnectivity-ca\") pod \"konnectivity-agent-cdgwc\" (UID: \"7f20dda8-0907-46fa-84c0-d1304b1105df\") " pod="kube-system/konnectivity-agent-cdgwc"
Apr 23 16:34:52.252388 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252318 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9696cbb9-a1db-4ead-914d-e2d11faa33b6-cnibin\") pod \"multus-additional-cni-plugins-58q6m\" (UID: \"9696cbb9-a1db-4ead-914d-e2d11faa33b6\") " pod="openshift-multus/multus-additional-cni-plugins-58q6m"
Apr 23 16:34:52.252388 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252355 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-systemd-units\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.252388 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252379 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-host-run-netns\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.252549 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252396 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-systemd-units\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.252549 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252357 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9696cbb9-a1db-4ead-914d-e2d11faa33b6-cnibin\") pod \"multus-additional-cni-plugins-58q6m\" (UID: \"9696cbb9-a1db-4ead-914d-e2d11faa33b6\") " pod="openshift-multus/multus-additional-cni-plugins-58q6m"
Apr 23 16:34:52.252549 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252406 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-host-run-ovn-kubernetes\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.252549 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252434 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/84c993c8-4dd2-40dc-b624-68a9f75a89cb-env-overrides\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.252549 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252439 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-host-run-ovn-kubernetes\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.252549 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252463 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/58db2d9c-607b-4549-8e61-e385991f3a16-iptables-alerter-script\") pod \"iptables-alerter-qg8fj\" (UID: \"58db2d9c-607b-4549-8e61-e385991f3a16\") " pod="openshift-network-operator/iptables-alerter-qg8fj"
Apr 23 16:34:52.252549 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252481 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-host-run-netns\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.252549 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252490 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-hostroot\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk"
Apr 23 16:34:52.252549 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252513 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-host-run-multus-certs\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk"
Apr 23 16:34:52.252549 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252517 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9696cbb9-a1db-4ead-914d-e2d11faa33b6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-58q6m\" (UID: \"9696cbb9-a1db-4ead-914d-e2d11faa33b6\") " pod="openshift-multus/multus-additional-cni-plugins-58q6m"
Apr 23 16:34:52.252549 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252536 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/84c993c8-4dd2-40dc-b624-68a9f75a89cb-ovn-node-metrics-cert\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.253015 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252561 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-hostroot\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk"
Apr 23 16:34:52.253015 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252563 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28-registration-dir\") pod \"aws-ebs-csi-driver-node-rmhqk\" (UID: \"6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk"
Apr 23 16:34:52.253015 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252628 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-host-run-multus-certs\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk"
Apr 23 16:34:52.253015 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252655 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-etc-sysconfig\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs"
Apr 23 16:34:52.253015 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252681 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-etc-sysctl-d\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs"
Apr 23 16:34:52.253015 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252706 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-lib-modules\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs"
Apr 23 16:34:52.253015 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252733 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f3054782-d344-49bd-865a-493a82cdebb1-etc-tuned\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs"
Apr 23 16:34:52.253015 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252759 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9696cbb9-a1db-4ead-914d-e2d11faa33b6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-58q6m\" (UID: \"9696cbb9-a1db-4ead-914d-e2d11faa33b6\") " pod="openshift-multus/multus-additional-cni-plugins-58q6m"
Apr 23 16:34:52.253015 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252778 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/76787c69-1999-41dd-9713-d68801605aa8-multus-daemon-config\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk"
Apr 23 16:34:52.253015 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252785 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-run-openvswitch\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.253015 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252811 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-etc-modprobe-d\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs"
Apr 23 16:34:52.253015 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252838 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-etc-sysctl-conf\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs"
Apr 23 16:34:52.253015 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252851 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/84c993c8-4dd2-40dc-b624-68a9f75a89cb-env-overrides\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.253015 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252902 2580 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 23 16:34:52.253015 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252906 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/325c692e-07b3-4dcc-984b-733489080887-hosts-file\") pod \"node-resolver-78hbm\" (UID: \"325c692e-07b3-4dcc-984b-733489080887\") " pod="openshift-dns/node-resolver-78hbm"
Apr 23 16:34:52.253015 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252991 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9696cbb9-a1db-4ead-914d-e2d11faa33b6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-58q6m\" (UID: \"9696cbb9-a1db-4ead-914d-e2d11faa33b6\") " pod="openshift-multus/multus-additional-cni-plugins-58q6m"
Apr 23 16:34:52.253015 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.252908 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-run-openvswitch\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.253744 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253018 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/95e238a6-f3c9-4b3a-a7de-1bec7cf6b287-serviceca\") pod \"node-ca-rshv8\" (UID: \"95e238a6-f3c9-4b3a-a7de-1bec7cf6b287\") " pod="openshift-image-registry/node-ca-rshv8"
Apr 23 16:34:52.253744 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253046 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-host-slash\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.253744 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253064 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/84c993c8-4dd2-40dc-b624-68a9f75a89cb-ovnkube-config\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.253744 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253089 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llj9s\" (UniqueName: \"kubernetes.io/projected/6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28-kube-api-access-llj9s\") pod \"aws-ebs-csi-driver-node-rmhqk\" (UID: \"6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk"
Apr 23 16:34:52.253744 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253113 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-host-slash\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.253744 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253117 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qj9m\" (UniqueName: \"kubernetes.io/projected/f3054782-d344-49bd-865a-493a82cdebb1-kube-api-access-7qj9m\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs"
Apr 23 16:34:52.253744 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253143 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-etc-kubernetes\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk"
Apr 23 16:34:52.253744 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253174 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9696cbb9-a1db-4ead-914d-e2d11faa33b6-cni-binary-copy\") pod \"multus-additional-cni-plugins-58q6m\" (UID: \"9696cbb9-a1db-4ead-914d-e2d11faa33b6\") " pod="openshift-multus/multus-additional-cni-plugins-58q6m"
Apr 23 16:34:52.253744 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253201 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/76787c69-1999-41dd-9713-d68801605aa8-cni-binary-copy\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk"
Apr 23 16:34:52.253744 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253224 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9696cbb9-a1db-4ead-914d-e2d11faa33b6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-58q6m\" (UID: \"9696cbb9-a1db-4ead-914d-e2d11faa33b6\") " pod="openshift-multus/multus-additional-cni-plugins-58q6m"
Apr 23 16:34:52.253744 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253229 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-etc-kubernetes\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk"
Apr 23 16:34:52.253744 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253247 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-etc-openvswitch\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.253744 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253318 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-host-cni-bin\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:34:52.253744 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253347 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rmhqk\" (UID: \"6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk"
Apr 23 16:34:52.253744 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253374 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28-device-dir\") pod \"aws-ebs-csi-driver-node-rmhqk\" (UID: \"6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk"
Apr 23 16:34:52.253744 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253394 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/95e238a6-f3c9-4b3a-a7de-1bec7cf6b287-host\") pod \"node-ca-rshv8\" (UID: \"95e238a6-f3c9-4b3a-a7de-1bec7cf6b287\") " pod="openshift-image-registry/node-ca-rshv8"
Apr 23
16:34:52.253744 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253409 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-multus-cni-dir\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.254497 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253439 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-run-systemd\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" Apr 23 16:34:52.254497 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253454 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-log-socket\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" Apr 23 16:34:52.254497 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253469 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-run\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.254497 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253481 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/84c993c8-4dd2-40dc-b624-68a9f75a89cb-ovnkube-config\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" Apr 23 16:34:52.254497 ip-10-0-142-4 
kubenswrapper[2580]: I0423 16:34:52.253276 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-etc-openvswitch\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" Apr 23 16:34:52.254497 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253484 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-var-lib-kubelet\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.254497 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253521 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-cnibin\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.254497 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253535 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-host-cni-bin\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" Apr 23 16:34:52.254497 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253537 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-os-release\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.254497 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253560 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-host-run-k8s-cni-cncf-io\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.254497 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253596 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-os-release\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.254497 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253597 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-host-var-lib-cni-multus\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.254497 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253623 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4fgf5\" (UniqueName: \"kubernetes.io/projected/76787c69-1999-41dd-9713-d68801605aa8-kube-api-access-4fgf5\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.254497 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253632 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-host-var-lib-cni-multus\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.254497 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253674 2580 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-host-run-k8s-cni-cncf-io\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.254497 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253678 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/76787c69-1999-41dd-9713-d68801605aa8-cni-binary-copy\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.254497 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253697 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-multus-cni-dir\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.254497 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253734 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-host-kubelet\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" Apr 23 16:34:52.255155 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253759 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-host\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.255155 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253761 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-cnibin\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.255155 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253785 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/325c692e-07b3-4dcc-984b-733489080887-tmp-dir\") pod \"node-resolver-78hbm\" (UID: \"325c692e-07b3-4dcc-984b-733489080887\") " pod="openshift-dns/node-resolver-78hbm" Apr 23 16:34:52.255155 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253815 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-host-kubelet\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" Apr 23 16:34:52.255155 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253832 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-multus-conf-dir\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.255155 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253853 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-log-socket\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" Apr 23 16:34:52.255155 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253858 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28-etc-selinux\") pod \"aws-ebs-csi-driver-node-rmhqk\" (UID: \"6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk" Apr 23 16:34:52.255155 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253882 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f3054782-d344-49bd-865a-493a82cdebb1-tmp\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.255155 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253889 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-run-systemd\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" Apr 23 16:34:52.255155 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.253929 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-multus-conf-dir\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.255155 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.254017 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67b8cec4-f05e-4ef7-9456-915dfa5c7554-metrics-certs\") pod \"network-metrics-daemon-f889w\" (UID: \"67b8cec4-f05e-4ef7-9456-915dfa5c7554\") " pod="openshift-multus/network-metrics-daemon-f889w" Apr 23 16:34:52.255155 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.254099 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/58db2d9c-607b-4549-8e61-e385991f3a16-host-slash\") pod \"iptables-alerter-qg8fj\" (UID: \"58db2d9c-607b-4549-8e61-e385991f3a16\") " pod="openshift-network-operator/iptables-alerter-qg8fj" Apr 23 16:34:52.255155 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.254126 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blzx4\" (UniqueName: \"kubernetes.io/projected/58db2d9c-607b-4549-8e61-e385991f3a16-kube-api-access-blzx4\") pod \"iptables-alerter-qg8fj\" (UID: \"58db2d9c-607b-4549-8e61-e385991f3a16\") " pod="openshift-network-operator/iptables-alerter-qg8fj" Apr 23 16:34:52.255155 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.254150 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9696cbb9-a1db-4ead-914d-e2d11faa33b6-system-cni-dir\") pod \"multus-additional-cni-plugins-58q6m\" (UID: \"9696cbb9-a1db-4ead-914d-e2d11faa33b6\") " pod="openshift-multus/multus-additional-cni-plugins-58q6m" Apr 23 16:34:52.255155 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.254171 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9696cbb9-a1db-4ead-914d-e2d11faa33b6-os-release\") pod \"multus-additional-cni-plugins-58q6m\" (UID: \"9696cbb9-a1db-4ead-914d-e2d11faa33b6\") " pod="openshift-multus/multus-additional-cni-plugins-58q6m" Apr 23 16:34:52.255155 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.254193 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fcvz\" (UniqueName: \"kubernetes.io/projected/38d83fc0-30d4-48d7-8aee-f7afaa404c2e-kube-api-access-2fcvz\") pod \"network-check-target-7bn2z\" (UID: \"38d83fc0-30d4-48d7-8aee-f7afaa404c2e\") " pod="openshift-network-diagnostics/network-check-target-7bn2z" Apr 23 16:34:52.255155 
ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:52.254197 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:34:52.255155 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.254211 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-sys\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.255700 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.254214 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9696cbb9-a1db-4ead-914d-e2d11faa33b6-system-cni-dir\") pod \"multus-additional-cni-plugins-58q6m\" (UID: \"9696cbb9-a1db-4ead-914d-e2d11faa33b6\") " pod="openshift-multus/multus-additional-cni-plugins-58q6m" Apr 23 16:34:52.255700 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.254237 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-djdqj\" (UniqueName: \"kubernetes.io/projected/9696cbb9-a1db-4ead-914d-e2d11faa33b6-kube-api-access-djdqj\") pod \"multus-additional-cni-plugins-58q6m\" (UID: \"9696cbb9-a1db-4ead-914d-e2d11faa33b6\") " pod="openshift-multus/multus-additional-cni-plugins-58q6m" Apr 23 16:34:52.255700 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.254246 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9696cbb9-a1db-4ead-914d-e2d11faa33b6-os-release\") pod \"multus-additional-cni-plugins-58q6m\" (UID: \"9696cbb9-a1db-4ead-914d-e2d11faa33b6\") " pod="openshift-multus/multus-additional-cni-plugins-58q6m" Apr 23 16:34:52.255700 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.254267 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-host-run-netns\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.255700 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:52.254300 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67b8cec4-f05e-4ef7-9456-915dfa5c7554-metrics-certs podName:67b8cec4-f05e-4ef7-9456-915dfa5c7554 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:52.754266243 +0000 UTC m=+3.017052177 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/67b8cec4-f05e-4ef7-9456-915dfa5c7554-metrics-certs") pod "network-metrics-daemon-f889w" (UID: "67b8cec4-f05e-4ef7-9456-915dfa5c7554") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:34:52.255700 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.254306 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-host-run-netns\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.255700 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.254346 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-host-var-lib-cni-bin\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.255700 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.254376 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" Apr 23 16:34:52.255700 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.254410 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7f20dda8-0907-46fa-84c0-d1304b1105df-agent-certs\") pod \"konnectivity-agent-cdgwc\" (UID: \"7f20dda8-0907-46fa-84c0-d1304b1105df\") " pod="kube-system/konnectivity-agent-cdgwc" Apr 23 16:34:52.255700 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.254436 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqflc\" (UniqueName: \"kubernetes.io/projected/95e238a6-f3c9-4b3a-a7de-1bec7cf6b287-kube-api-access-jqflc\") pod \"node-ca-rshv8\" (UID: \"95e238a6-f3c9-4b3a-a7de-1bec7cf6b287\") " pod="openshift-image-registry/node-ca-rshv8" Apr 23 16:34:52.255700 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.254461 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9696cbb9-a1db-4ead-914d-e2d11faa33b6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-58q6m\" (UID: \"9696cbb9-a1db-4ead-914d-e2d11faa33b6\") " pod="openshift-multus/multus-additional-cni-plugins-58q6m" Apr 23 16:34:52.255700 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.254469 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-host-var-lib-kubelet\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.255700 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.254435 2580 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-host-var-lib-cni-bin\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.255700 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.254471 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/84c993c8-4dd2-40dc-b624-68a9f75a89cb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" Apr 23 16:34:52.255700 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.254527 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9696cbb9-a1db-4ead-914d-e2d11faa33b6-cni-binary-copy\") pod \"multus-additional-cni-plugins-58q6m\" (UID: \"9696cbb9-a1db-4ead-914d-e2d11faa33b6\") " pod="openshift-multus/multus-additional-cni-plugins-58q6m" Apr 23 16:34:52.255700 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.254616 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/76787c69-1999-41dd-9713-d68801605aa8-host-var-lib-kubelet\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.256480 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.256463 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/84c993c8-4dd2-40dc-b624-68a9f75a89cb-ovn-node-metrics-cert\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" Apr 23 16:34:52.257334 ip-10-0-142-4 kubenswrapper[2580]: I0423 
16:34:52.257288 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7f20dda8-0907-46fa-84c0-d1304b1105df-agent-certs\") pod \"konnectivity-agent-cdgwc\" (UID: \"7f20dda8-0907-46fa-84c0-d1304b1105df\") " pod="kube-system/konnectivity-agent-cdgwc" Apr 23 16:34:52.262808 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.262785 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrd55\" (UniqueName: \"kubernetes.io/projected/84c993c8-4dd2-40dc-b624-68a9f75a89cb-kube-api-access-wrd55\") pod \"ovnkube-node-xvn7t\" (UID: \"84c993c8-4dd2-40dc-b624-68a9f75a89cb\") " pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" Apr 23 16:34:52.264719 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.264687 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdqfl\" (UniqueName: \"kubernetes.io/projected/67b8cec4-f05e-4ef7-9456-915dfa5c7554-kube-api-access-fdqfl\") pod \"network-metrics-daemon-f889w\" (UID: \"67b8cec4-f05e-4ef7-9456-915dfa5c7554\") " pod="openshift-multus/network-metrics-daemon-f889w" Apr 23 16:34:52.264857 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.264833 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fgf5\" (UniqueName: \"kubernetes.io/projected/76787c69-1999-41dd-9713-d68801605aa8-kube-api-access-4fgf5\") pod \"multus-8rnmk\" (UID: \"76787c69-1999-41dd-9713-d68801605aa8\") " pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.265536 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.265512 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-djdqj\" (UniqueName: \"kubernetes.io/projected/9696cbb9-a1db-4ead-914d-e2d11faa33b6-kube-api-access-djdqj\") pod \"multus-additional-cni-plugins-58q6m\" (UID: \"9696cbb9-a1db-4ead-914d-e2d11faa33b6\") " pod="openshift-multus/multus-additional-cni-plugins-58q6m" Apr 23 16:34:52.355774 
ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.355739 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rmhqk\" (UID: \"6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk" Apr 23 16:34:52.355774 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.355779 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28-device-dir\") pod \"aws-ebs-csi-driver-node-rmhqk\" (UID: \"6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk" Apr 23 16:34:52.356016 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.355875 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28-device-dir\") pod \"aws-ebs-csi-driver-node-rmhqk\" (UID: \"6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk" Apr 23 16:34:52.356016 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.355887 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rmhqk\" (UID: \"6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk" Apr 23 16:34:52.356016 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.355943 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/95e238a6-f3c9-4b3a-a7de-1bec7cf6b287-host\") pod \"node-ca-rshv8\" (UID: \"95e238a6-f3c9-4b3a-a7de-1bec7cf6b287\") " 
pod="openshift-image-registry/node-ca-rshv8" Apr 23 16:34:52.356016 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.355980 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-run\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.356016 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356005 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-var-lib-kubelet\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.356016 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356015 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/95e238a6-f3c9-4b3a-a7de-1bec7cf6b287-host\") pod \"node-ca-rshv8\" (UID: \"95e238a6-f3c9-4b3a-a7de-1bec7cf6b287\") " pod="openshift-image-registry/node-ca-rshv8" Apr 23 16:34:52.356241 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356033 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-host\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.356241 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356056 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/325c692e-07b3-4dcc-984b-733489080887-tmp-dir\") pod \"node-resolver-78hbm\" (UID: \"325c692e-07b3-4dcc-984b-733489080887\") " pod="openshift-dns/node-resolver-78hbm" Apr 23 16:34:52.356241 ip-10-0-142-4 
kubenswrapper[2580]: I0423 16:34:52.356064 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-run\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.356241 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356074 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28-etc-selinux\") pod \"aws-ebs-csi-driver-node-rmhqk\" (UID: \"6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk" Apr 23 16:34:52.356241 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356097 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f3054782-d344-49bd-865a-493a82cdebb1-tmp\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.356241 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356106 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-host\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.356241 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356139 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/58db2d9c-607b-4549-8e61-e385991f3a16-host-slash\") pod \"iptables-alerter-qg8fj\" (UID: \"58db2d9c-607b-4549-8e61-e385991f3a16\") " pod="openshift-network-operator/iptables-alerter-qg8fj" Apr 23 16:34:52.356241 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356153 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-var-lib-kubelet\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.356241 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356160 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-blzx4\" (UniqueName: \"kubernetes.io/projected/58db2d9c-607b-4549-8e61-e385991f3a16-kube-api-access-blzx4\") pod \"iptables-alerter-qg8fj\" (UID: \"58db2d9c-607b-4549-8e61-e385991f3a16\") " pod="openshift-network-operator/iptables-alerter-qg8fj" Apr 23 16:34:52.356241 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356193 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fcvz\" (UniqueName: \"kubernetes.io/projected/38d83fc0-30d4-48d7-8aee-f7afaa404c2e-kube-api-access-2fcvz\") pod \"network-check-target-7bn2z\" (UID: \"38d83fc0-30d4-48d7-8aee-f7afaa404c2e\") " pod="openshift-network-diagnostics/network-check-target-7bn2z" Apr 23 16:34:52.356241 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356213 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-sys\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.356675 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356239 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jqflc\" (UniqueName: \"kubernetes.io/projected/95e238a6-f3c9-4b3a-a7de-1bec7cf6b287-kube-api-access-jqflc\") pod \"node-ca-rshv8\" (UID: \"95e238a6-f3c9-4b3a-a7de-1bec7cf6b287\") " pod="openshift-image-registry/node-ca-rshv8" Apr 23 16:34:52.356675 ip-10-0-142-4 
kubenswrapper[2580]: I0423 16:34:52.356280 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28-sys-fs\") pod \"aws-ebs-csi-driver-node-rmhqk\" (UID: \"6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk" Apr 23 16:34:52.356675 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356304 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vsxnm\" (UniqueName: \"kubernetes.io/projected/325c692e-07b3-4dcc-984b-733489080887-kube-api-access-vsxnm\") pod \"node-resolver-78hbm\" (UID: \"325c692e-07b3-4dcc-984b-733489080887\") " pod="openshift-dns/node-resolver-78hbm" Apr 23 16:34:52.356675 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356323 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-etc-kubernetes\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.356675 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356338 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-etc-systemd\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.356675 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356365 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28-socket-dir\") pod \"aws-ebs-csi-driver-node-rmhqk\" (UID: \"6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk" 
Apr 23 16:34:52.356675 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356389 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/58db2d9c-607b-4549-8e61-e385991f3a16-iptables-alerter-script\") pod \"iptables-alerter-qg8fj\" (UID: \"58db2d9c-607b-4549-8e61-e385991f3a16\") " pod="openshift-network-operator/iptables-alerter-qg8fj" Apr 23 16:34:52.356675 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356416 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28-registration-dir\") pod \"aws-ebs-csi-driver-node-rmhqk\" (UID: \"6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk" Apr 23 16:34:52.356675 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356442 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-etc-sysconfig\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.356675 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356454 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28-etc-selinux\") pod \"aws-ebs-csi-driver-node-rmhqk\" (UID: \"6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk" Apr 23 16:34:52.356675 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356468 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-etc-sysctl-d\") pod \"tuned-kwzbs\" (UID: 
\"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.356675 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356490 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-lib-modules\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.356675 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356510 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f3054782-d344-49bd-865a-493a82cdebb1-etc-tuned\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.356675 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356530 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-etc-modprobe-d\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.356675 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356551 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-etc-sysctl-conf\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.356675 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356570 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/325c692e-07b3-4dcc-984b-733489080887-hosts-file\") pod \"node-resolver-78hbm\" (UID: 
\"325c692e-07b3-4dcc-984b-733489080887\") " pod="openshift-dns/node-resolver-78hbm" Apr 23 16:34:52.356675 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356612 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/95e238a6-f3c9-4b3a-a7de-1bec7cf6b287-serviceca\") pod \"node-ca-rshv8\" (UID: \"95e238a6-f3c9-4b3a-a7de-1bec7cf6b287\") " pod="openshift-image-registry/node-ca-rshv8" Apr 23 16:34:52.357420 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356641 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-llj9s\" (UniqueName: \"kubernetes.io/projected/6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28-kube-api-access-llj9s\") pod \"aws-ebs-csi-driver-node-rmhqk\" (UID: \"6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk" Apr 23 16:34:52.357420 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356656 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qj9m\" (UniqueName: \"kubernetes.io/projected/f3054782-d344-49bd-865a-493a82cdebb1-kube-api-access-7qj9m\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.357420 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356824 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/58db2d9c-607b-4549-8e61-e385991f3a16-host-slash\") pod \"iptables-alerter-qg8fj\" (UID: \"58db2d9c-607b-4549-8e61-e385991f3a16\") " pod="openshift-network-operator/iptables-alerter-qg8fj" Apr 23 16:34:52.357420 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.356954 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28-registration-dir\") pod 
\"aws-ebs-csi-driver-node-rmhqk\" (UID: \"6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk" Apr 23 16:34:52.357420 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.357009 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-etc-sysconfig\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.357420 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.357130 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-etc-sysctl-d\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.357420 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.357132 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/325c692e-07b3-4dcc-984b-733489080887-tmp-dir\") pod \"node-resolver-78hbm\" (UID: \"325c692e-07b3-4dcc-984b-733489080887\") " pod="openshift-dns/node-resolver-78hbm" Apr 23 16:34:52.357420 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.357202 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-lib-modules\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.357420 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.357232 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-etc-systemd\") pod \"tuned-kwzbs\" (UID: 
\"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.357420 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.357334 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28-socket-dir\") pod \"aws-ebs-csi-driver-node-rmhqk\" (UID: \"6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk" Apr 23 16:34:52.357992 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.357646 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/325c692e-07b3-4dcc-984b-733489080887-hosts-file\") pod \"node-resolver-78hbm\" (UID: \"325c692e-07b3-4dcc-984b-733489080887\") " pod="openshift-dns/node-resolver-78hbm" Apr 23 16:34:52.357992 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.357746 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-etc-modprobe-d\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.357992 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.357846 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/58db2d9c-607b-4549-8e61-e385991f3a16-iptables-alerter-script\") pod \"iptables-alerter-qg8fj\" (UID: \"58db2d9c-607b-4549-8e61-e385991f3a16\") " pod="openshift-network-operator/iptables-alerter-qg8fj" Apr 23 16:34:52.357992 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.357873 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-etc-sysctl-conf\") pod \"tuned-kwzbs\" (UID: 
\"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.358182 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.358050 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28-sys-fs\") pod \"aws-ebs-csi-driver-node-rmhqk\" (UID: \"6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk" Apr 23 16:34:52.358182 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.358096 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-sys\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.358182 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.358156 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3054782-d344-49bd-865a-493a82cdebb1-etc-kubernetes\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.358320 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.358266 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/95e238a6-f3c9-4b3a-a7de-1bec7cf6b287-serviceca\") pod \"node-ca-rshv8\" (UID: \"95e238a6-f3c9-4b3a-a7de-1bec7cf6b287\") " pod="openshift-image-registry/node-ca-rshv8" Apr 23 16:34:52.359146 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.359098 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f3054782-d344-49bd-865a-493a82cdebb1-tmp\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " 
pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.359659 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.359639 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f3054782-d344-49bd-865a-493a82cdebb1-etc-tuned\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.364305 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:52.363540 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:34:52.364305 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:52.363592 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:34:52.364305 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:52.363627 2580 projected.go:194] Error preparing data for projected volume kube-api-access-2fcvz for pod openshift-network-diagnostics/network-check-target-7bn2z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:34:52.364305 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:52.363696 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38d83fc0-30d4-48d7-8aee-f7afaa404c2e-kube-api-access-2fcvz podName:38d83fc0-30d4-48d7-8aee-f7afaa404c2e nodeName:}" failed. No retries permitted until 2026-04-23 16:34:52.863677166 +0000 UTC m=+3.126463095 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-2fcvz" (UniqueName: "kubernetes.io/projected/38d83fc0-30d4-48d7-8aee-f7afaa404c2e-kube-api-access-2fcvz") pod "network-check-target-7bn2z" (UID: "38d83fc0-30d4-48d7-8aee-f7afaa404c2e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:34:52.365871 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.365844 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qj9m\" (UniqueName: \"kubernetes.io/projected/f3054782-d344-49bd-865a-493a82cdebb1-kube-api-access-7qj9m\") pod \"tuned-kwzbs\" (UID: \"f3054782-d344-49bd-865a-493a82cdebb1\") " pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.366332 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.366308 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsxnm\" (UniqueName: \"kubernetes.io/projected/325c692e-07b3-4dcc-984b-733489080887-kube-api-access-vsxnm\") pod \"node-resolver-78hbm\" (UID: \"325c692e-07b3-4dcc-984b-733489080887\") " pod="openshift-dns/node-resolver-78hbm" Apr 23 16:34:52.366467 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.366450 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-blzx4\" (UniqueName: \"kubernetes.io/projected/58db2d9c-607b-4549-8e61-e385991f3a16-kube-api-access-blzx4\") pod \"iptables-alerter-qg8fj\" (UID: \"58db2d9c-607b-4549-8e61-e385991f3a16\") " pod="openshift-network-operator/iptables-alerter-qg8fj" Apr 23 16:34:52.367109 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.367091 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-llj9s\" (UniqueName: \"kubernetes.io/projected/6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28-kube-api-access-llj9s\") pod \"aws-ebs-csi-driver-node-rmhqk\" (UID: \"6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk" Apr 23 16:34:52.367642 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.367626 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqflc\" (UniqueName: \"kubernetes.io/projected/95e238a6-f3c9-4b3a-a7de-1bec7cf6b287-kube-api-access-jqflc\") pod \"node-ca-rshv8\" (UID: \"95e238a6-f3c9-4b3a-a7de-1bec7cf6b287\") " pod="openshift-image-registry/node-ca-rshv8" Apr 23 16:34:52.446305 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.446256 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-cdgwc" Apr 23 16:34:52.455119 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.455091 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8rnmk" Apr 23 16:34:52.466013 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.465983 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-58q6m" Apr 23 16:34:52.472791 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.472762 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk" Apr 23 16:34:52.479633 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.479606 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" Apr 23 16:34:52.490227 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.490146 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" Apr 23 16:34:52.497938 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.497907 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-78hbm" Apr 23 16:34:52.505612 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.505569 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rshv8" Apr 23 16:34:52.514313 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.514257 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qg8fj" Apr 23 16:34:52.760032 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.759929 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67b8cec4-f05e-4ef7-9456-915dfa5c7554-metrics-certs\") pod \"network-metrics-daemon-f889w\" (UID: \"67b8cec4-f05e-4ef7-9456-915dfa5c7554\") " pod="openshift-multus/network-metrics-daemon-f889w" Apr 23 16:34:52.760191 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:52.760063 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:34:52.760191 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:52.760115 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67b8cec4-f05e-4ef7-9456-915dfa5c7554-metrics-certs podName:67b8cec4-f05e-4ef7-9456-915dfa5c7554 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:53.760100739 +0000 UTC m=+4.022886650 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/67b8cec4-f05e-4ef7-9456-915dfa5c7554-metrics-certs") pod "network-metrics-daemon-f889w" (UID: "67b8cec4-f05e-4ef7-9456-915dfa5c7554") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:34:52.961291 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:52.960931 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fcvz\" (UniqueName: \"kubernetes.io/projected/38d83fc0-30d4-48d7-8aee-f7afaa404c2e-kube-api-access-2fcvz\") pod \"network-check-target-7bn2z\" (UID: \"38d83fc0-30d4-48d7-8aee-f7afaa404c2e\") " pod="openshift-network-diagnostics/network-check-target-7bn2z" Apr 23 16:34:52.961291 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:52.960994 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58db2d9c_607b_4549_8e61_e385991f3a16.slice/crio-084f86d20321bd397e52b265f06d6ce8a0e74f2885347753ea68dbf5033f103b WatchSource:0}: Error finding container 084f86d20321bd397e52b265f06d6ce8a0e74f2885347753ea68dbf5033f103b: Status 404 returned error can't find the container with id 084f86d20321bd397e52b265f06d6ce8a0e74f2885347753ea68dbf5033f103b Apr 23 16:34:52.961384 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:52.961331 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:34:52.961384 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:52.961356 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:34:52.961384 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:52.961372 2580 projected.go:194] Error preparing data for projected volume kube-api-access-2fcvz for pod 
openshift-network-diagnostics/network-check-target-7bn2z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:34:52.961558 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:52.961454 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38d83fc0-30d4-48d7-8aee-f7afaa404c2e-kube-api-access-2fcvz podName:38d83fc0-30d4-48d7-8aee-f7afaa404c2e nodeName:}" failed. No retries permitted until 2026-04-23 16:34:53.9614328 +0000 UTC m=+4.224218729 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-2fcvz" (UniqueName: "kubernetes.io/projected/38d83fc0-30d4-48d7-8aee-f7afaa404c2e-kube-api-access-2fcvz") pod "network-check-target-7bn2z" (UID: "38d83fc0-30d4-48d7-8aee-f7afaa404c2e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:34:52.963764 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:52.963723 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76787c69_1999_41dd_9713_d68801605aa8.slice/crio-2be7d775b7471e56a033ceb4e157ef0f8af0e9a97995ed324dd21990ff333724 WatchSource:0}: Error finding container 2be7d775b7471e56a033ceb4e157ef0f8af0e9a97995ed324dd21990ff333724: Status 404 returned error can't find the container with id 2be7d775b7471e56a033ceb4e157ef0f8af0e9a97995ed324dd21990ff333724 Apr 23 16:34:52.968362 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:52.968273 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3054782_d344_49bd_865a_493a82cdebb1.slice/crio-1a58232500d3acac782c865d0b4a3fb361d4320d8b4e57f81d3d38074e9db094 WatchSource:0}: Error finding container 
1a58232500d3acac782c865d0b4a3fb361d4320d8b4e57f81d3d38074e9db094: Status 404 returned error can't find the container with id 1a58232500d3acac782c865d0b4a3fb361d4320d8b4e57f81d3d38074e9db094 Apr 23 16:34:52.969188 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:52.969152 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84c993c8_4dd2_40dc_b624_68a9f75a89cb.slice/crio-9e8429d4e09f8d50bdc09f84b6103df3e6b87d29993eeeff81cbbd239995a6b4 WatchSource:0}: Error finding container 9e8429d4e09f8d50bdc09f84b6103df3e6b87d29993eeeff81cbbd239995a6b4: Status 404 returned error can't find the container with id 9e8429d4e09f8d50bdc09f84b6103df3e6b87d29993eeeff81cbbd239995a6b4 Apr 23 16:34:52.970152 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:52.970128 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95e238a6_f3c9_4b3a_a7de_1bec7cf6b287.slice/crio-23bef95064051d849476fc99aaea04f5c53934ab048c266a2e084d7f7f34c92d WatchSource:0}: Error finding container 23bef95064051d849476fc99aaea04f5c53934ab048c266a2e084d7f7f34c92d: Status 404 returned error can't find the container with id 23bef95064051d849476fc99aaea04f5c53934ab048c266a2e084d7f7f34c92d Apr 23 16:34:52.973952 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:52.973883 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9696cbb9_a1db_4ead_914d_e2d11faa33b6.slice/crio-c69d1b3c4932ac0f9de0bdd365f9c929de68a6a7afead16d86884f7e73432a8e WatchSource:0}: Error finding container c69d1b3c4932ac0f9de0bdd365f9c929de68a6a7afead16d86884f7e73432a8e: Status 404 returned error can't find the container with id c69d1b3c4932ac0f9de0bdd365f9c929de68a6a7afead16d86884f7e73432a8e Apr 23 16:34:52.974140 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:52.974114 2580 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d5505a1_8a5b_4a8e_8fd3_86a1feb40f28.slice/crio-567442941b22bcd5cf7d09f97080dac583048d19a842727c0894566e8925c5c4 WatchSource:0}: Error finding container 567442941b22bcd5cf7d09f97080dac583048d19a842727c0894566e8925c5c4: Status 404 returned error can't find the container with id 567442941b22bcd5cf7d09f97080dac583048d19a842727c0894566e8925c5c4
Apr 23 16:34:52.996023 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:52.995989 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f20dda8_0907_46fa_84c0_d1304b1105df.slice/crio-fbbf9ee92e493f1d3f5acd8c8273df019111050aca918d39ca273c22268aa692 WatchSource:0}: Error finding container fbbf9ee92e493f1d3f5acd8c8273df019111050aca918d39ca273c22268aa692: Status 404 returned error can't find the container with id fbbf9ee92e493f1d3f5acd8c8273df019111050aca918d39ca273c22268aa692
Apr 23 16:34:52.996885 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:34:52.996853 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod325c692e_07b3_4dcc_984b_733489080887.slice/crio-f127777684c27f33fa21fc0e61bf7d1b80b33641ab82faa6b86b0e4c51827c62 WatchSource:0}: Error finding container f127777684c27f33fa21fc0e61bf7d1b80b33641ab82faa6b86b0e4c51827c62: Status 404 returned error can't find the container with id f127777684c27f33fa21fc0e61bf7d1b80b33641ab82faa6b86b0e4c51827c62
Apr 23 16:34:53.200947 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:53.200907 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 16:29:51 +0000 UTC" deadline="2027-10-16 20:57:46.798697192 +0000 UTC"
Apr 23 16:34:53.200947 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:53.200941 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12988h22m53.597759126s"
Apr 23 16:34:53.298624 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:53.298518 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-4.ec2.internal" event={"ID":"6ca4686b7542221f742a8d14ace1d047","Type":"ContainerStarted","Data":"5b08beba44c37d6f7b3fe9e7fbe3fed623c1ff332ae0a76fc3a1b6ec84da8208"}
Apr 23 16:34:53.301962 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:53.301931 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-cdgwc" event={"ID":"7f20dda8-0907-46fa-84c0-d1304b1105df","Type":"ContainerStarted","Data":"fbbf9ee92e493f1d3f5acd8c8273df019111050aca918d39ca273c22268aa692"}
Apr 23 16:34:53.302995 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:53.302972 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-58q6m" event={"ID":"9696cbb9-a1db-4ead-914d-e2d11faa33b6","Type":"ContainerStarted","Data":"c69d1b3c4932ac0f9de0bdd365f9c929de68a6a7afead16d86884f7e73432a8e"}
Apr 23 16:34:53.303909 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:53.303890 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rshv8" event={"ID":"95e238a6-f3c9-4b3a-a7de-1bec7cf6b287","Type":"ContainerStarted","Data":"23bef95064051d849476fc99aaea04f5c53934ab048c266a2e084d7f7f34c92d"}
Apr 23 16:34:53.304869 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:53.304833 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-78hbm" event={"ID":"325c692e-07b3-4dcc-984b-733489080887","Type":"ContainerStarted","Data":"f127777684c27f33fa21fc0e61bf7d1b80b33641ab82faa6b86b0e4c51827c62"}
Apr 23 16:34:53.306380 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:53.306350 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" event={"ID":"f3054782-d344-49bd-865a-493a82cdebb1","Type":"ContainerStarted","Data":"1a58232500d3acac782c865d0b4a3fb361d4320d8b4e57f81d3d38074e9db094"}
Apr 23 16:34:53.307880 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:53.307862 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8rnmk" event={"ID":"76787c69-1999-41dd-9713-d68801605aa8","Type":"ContainerStarted","Data":"2be7d775b7471e56a033ceb4e157ef0f8af0e9a97995ed324dd21990ff333724"}
Apr 23 16:34:53.308742 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:53.308722 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" event={"ID":"84c993c8-4dd2-40dc-b624-68a9f75a89cb","Type":"ContainerStarted","Data":"9e8429d4e09f8d50bdc09f84b6103df3e6b87d29993eeeff81cbbd239995a6b4"}
Apr 23 16:34:53.309645 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:53.309624 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk" event={"ID":"6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28","Type":"ContainerStarted","Data":"567442941b22bcd5cf7d09f97080dac583048d19a842727c0894566e8925c5c4"}
Apr 23 16:34:53.310620 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:53.310599 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qg8fj" event={"ID":"58db2d9c-607b-4549-8e61-e385991f3a16","Type":"ContainerStarted","Data":"084f86d20321bd397e52b265f06d6ce8a0e74f2885347753ea68dbf5033f103b"}
Apr 23 16:34:53.319314 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:53.319267 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-4.ec2.internal" podStartSLOduration=2.319250657 podStartE2EDuration="2.319250657s" podCreationTimestamp="2026-04-23 16:34:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:34:53.318944371 +0000 UTC m=+3.581730305" watchObservedRunningTime="2026-04-23 16:34:53.319250657 +0000 UTC m=+3.582036591"
Apr 23 16:34:53.767544 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:53.767507 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67b8cec4-f05e-4ef7-9456-915dfa5c7554-metrics-certs\") pod \"network-metrics-daemon-f889w\" (UID: \"67b8cec4-f05e-4ef7-9456-915dfa5c7554\") " pod="openshift-multus/network-metrics-daemon-f889w"
Apr 23 16:34:53.767722 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:53.767699 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:34:53.767777 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:53.767761 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67b8cec4-f05e-4ef7-9456-915dfa5c7554-metrics-certs podName:67b8cec4-f05e-4ef7-9456-915dfa5c7554 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:55.767742457 +0000 UTC m=+6.030528376 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/67b8cec4-f05e-4ef7-9456-915dfa5c7554-metrics-certs") pod "network-metrics-daemon-f889w" (UID: "67b8cec4-f05e-4ef7-9456-915dfa5c7554") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:34:53.968611 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:53.968539 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fcvz\" (UniqueName: \"kubernetes.io/projected/38d83fc0-30d4-48d7-8aee-f7afaa404c2e-kube-api-access-2fcvz\") pod \"network-check-target-7bn2z\" (UID: \"38d83fc0-30d4-48d7-8aee-f7afaa404c2e\") " pod="openshift-network-diagnostics/network-check-target-7bn2z"
Apr 23 16:34:53.968837 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:53.968690 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:34:53.968837 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:53.968711 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:34:53.968837 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:53.968740 2580 projected.go:194] Error preparing data for projected volume kube-api-access-2fcvz for pod openshift-network-diagnostics/network-check-target-7bn2z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:34:53.968837 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:53.968804 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38d83fc0-30d4-48d7-8aee-f7afaa404c2e-kube-api-access-2fcvz podName:38d83fc0-30d4-48d7-8aee-f7afaa404c2e nodeName:}" failed. No retries permitted until 2026-04-23 16:34:55.96878182 +0000 UTC m=+6.231567740 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-2fcvz" (UniqueName: "kubernetes.io/projected/38d83fc0-30d4-48d7-8aee-f7afaa404c2e-kube-api-access-2fcvz") pod "network-check-target-7bn2z" (UID: "38d83fc0-30d4-48d7-8aee-f7afaa404c2e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:34:54.280717 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:54.280628 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f889w"
Apr 23 16:34:54.281215 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:54.280777 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f889w" podUID="67b8cec4-f05e-4ef7-9456-915dfa5c7554"
Apr 23 16:34:54.281215 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:54.281206 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7bn2z"
Apr 23 16:34:54.281326 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:54.281292 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7bn2z" podUID="38d83fc0-30d4-48d7-8aee-f7afaa404c2e"
Apr 23 16:34:54.322782 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:54.322734 2580 generic.go:358] "Generic (PLEG): container finished" podID="96c9e4ad769a0a0462456abe60e537d4" containerID="88a5f3fba79178565b7203ab56373070acbc0d0e655fd6ece7eecd6779ebb291" exitCode=0
Apr 23 16:34:54.324064 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:54.323790 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-4.ec2.internal" event={"ID":"96c9e4ad769a0a0462456abe60e537d4","Type":"ContainerDied","Data":"88a5f3fba79178565b7203ab56373070acbc0d0e655fd6ece7eecd6779ebb291"}
Apr 23 16:34:55.079850 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:55.079785 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-7lg5k"]
Apr 23 16:34:55.083514 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:55.083488 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7lg5k"
Apr 23 16:34:55.083700 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:55.083600 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7lg5k" podUID="4b505881-5503-4e1f-b72b-0d8abde1a5e0"
Apr 23 16:34:55.179859 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:55.179650 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4b505881-5503-4e1f-b72b-0d8abde1a5e0-kubelet-config\") pod \"global-pull-secret-syncer-7lg5k\" (UID: \"4b505881-5503-4e1f-b72b-0d8abde1a5e0\") " pod="kube-system/global-pull-secret-syncer-7lg5k"
Apr 23 16:34:55.179859 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:55.179701 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4b505881-5503-4e1f-b72b-0d8abde1a5e0-original-pull-secret\") pod \"global-pull-secret-syncer-7lg5k\" (UID: \"4b505881-5503-4e1f-b72b-0d8abde1a5e0\") " pod="kube-system/global-pull-secret-syncer-7lg5k"
Apr 23 16:34:55.179859 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:55.179743 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4b505881-5503-4e1f-b72b-0d8abde1a5e0-dbus\") pod \"global-pull-secret-syncer-7lg5k\" (UID: \"4b505881-5503-4e1f-b72b-0d8abde1a5e0\") " pod="kube-system/global-pull-secret-syncer-7lg5k"
Apr 23 16:34:55.280315 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:55.280262 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4b505881-5503-4e1f-b72b-0d8abde1a5e0-kubelet-config\") pod \"global-pull-secret-syncer-7lg5k\" (UID: \"4b505881-5503-4e1f-b72b-0d8abde1a5e0\") " pod="kube-system/global-pull-secret-syncer-7lg5k"
Apr 23 16:34:55.280315 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:55.280311 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4b505881-5503-4e1f-b72b-0d8abde1a5e0-original-pull-secret\") pod \"global-pull-secret-syncer-7lg5k\" (UID: \"4b505881-5503-4e1f-b72b-0d8abde1a5e0\") " pod="kube-system/global-pull-secret-syncer-7lg5k"
Apr 23 16:34:55.280553 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:55.280350 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4b505881-5503-4e1f-b72b-0d8abde1a5e0-dbus\") pod \"global-pull-secret-syncer-7lg5k\" (UID: \"4b505881-5503-4e1f-b72b-0d8abde1a5e0\") " pod="kube-system/global-pull-secret-syncer-7lg5k"
Apr 23 16:34:55.280616 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:55.280571 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4b505881-5503-4e1f-b72b-0d8abde1a5e0-dbus\") pod \"global-pull-secret-syncer-7lg5k\" (UID: \"4b505881-5503-4e1f-b72b-0d8abde1a5e0\") " pod="kube-system/global-pull-secret-syncer-7lg5k"
Apr 23 16:34:55.280670 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:55.280654 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4b505881-5503-4e1f-b72b-0d8abde1a5e0-kubelet-config\") pod \"global-pull-secret-syncer-7lg5k\" (UID: \"4b505881-5503-4e1f-b72b-0d8abde1a5e0\") " pod="kube-system/global-pull-secret-syncer-7lg5k"
Apr 23 16:34:55.280776 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:55.280750 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 16:34:55.281168 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:55.280805 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b505881-5503-4e1f-b72b-0d8abde1a5e0-original-pull-secret podName:4b505881-5503-4e1f-b72b-0d8abde1a5e0 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:55.780787518 +0000 UTC m=+6.043573443 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4b505881-5503-4e1f-b72b-0d8abde1a5e0-original-pull-secret") pod "global-pull-secret-syncer-7lg5k" (UID: "4b505881-5503-4e1f-b72b-0d8abde1a5e0") : object "kube-system"/"original-pull-secret" not registered
Apr 23 16:34:55.333677 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:55.333540 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-4.ec2.internal" event={"ID":"96c9e4ad769a0a0462456abe60e537d4","Type":"ContainerStarted","Data":"de8f9e62b2fa6783067537164e30cdf8494422799fe0edfdf9dfe18c81aa0619"}
Apr 23 16:34:55.786166 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:55.786127 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67b8cec4-f05e-4ef7-9456-915dfa5c7554-metrics-certs\") pod \"network-metrics-daemon-f889w\" (UID: \"67b8cec4-f05e-4ef7-9456-915dfa5c7554\") " pod="openshift-multus/network-metrics-daemon-f889w"
Apr 23 16:34:55.786387 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:55.786231 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4b505881-5503-4e1f-b72b-0d8abde1a5e0-original-pull-secret\") pod \"global-pull-secret-syncer-7lg5k\" (UID: \"4b505881-5503-4e1f-b72b-0d8abde1a5e0\") " pod="kube-system/global-pull-secret-syncer-7lg5k"
Apr 23 16:34:55.786387 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:55.786351 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 16:34:55.786557 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:55.786425 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b505881-5503-4e1f-b72b-0d8abde1a5e0-original-pull-secret podName:4b505881-5503-4e1f-b72b-0d8abde1a5e0 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:56.786395373 +0000 UTC m=+7.049181295 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4b505881-5503-4e1f-b72b-0d8abde1a5e0-original-pull-secret") pod "global-pull-secret-syncer-7lg5k" (UID: "4b505881-5503-4e1f-b72b-0d8abde1a5e0") : object "kube-system"/"original-pull-secret" not registered
Apr 23 16:34:55.786877 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:55.786856 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:34:55.786959 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:55.786915 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67b8cec4-f05e-4ef7-9456-915dfa5c7554-metrics-certs podName:67b8cec4-f05e-4ef7-9456-915dfa5c7554 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:59.786900399 +0000 UTC m=+10.049686311 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/67b8cec4-f05e-4ef7-9456-915dfa5c7554-metrics-certs") pod "network-metrics-daemon-f889w" (UID: "67b8cec4-f05e-4ef7-9456-915dfa5c7554") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:34:55.987473 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:55.987433 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fcvz\" (UniqueName: \"kubernetes.io/projected/38d83fc0-30d4-48d7-8aee-f7afaa404c2e-kube-api-access-2fcvz\") pod \"network-check-target-7bn2z\" (UID: \"38d83fc0-30d4-48d7-8aee-f7afaa404c2e\") " pod="openshift-network-diagnostics/network-check-target-7bn2z"
Apr 23 16:34:55.987660 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:55.987614 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:34:55.987660 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:55.987632 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:34:55.987660 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:55.987643 2580 projected.go:194] Error preparing data for projected volume kube-api-access-2fcvz for pod openshift-network-diagnostics/network-check-target-7bn2z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:34:55.987767 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:55.987704 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38d83fc0-30d4-48d7-8aee-f7afaa404c2e-kube-api-access-2fcvz podName:38d83fc0-30d4-48d7-8aee-f7afaa404c2e nodeName:}" failed. No retries permitted until 2026-04-23 16:34:59.987685663 +0000 UTC m=+10.250471578 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-2fcvz" (UniqueName: "kubernetes.io/projected/38d83fc0-30d4-48d7-8aee-f7afaa404c2e-kube-api-access-2fcvz") pod "network-check-target-7bn2z" (UID: "38d83fc0-30d4-48d7-8aee-f7afaa404c2e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:34:56.281952 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:56.281918 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f889w"
Apr 23 16:34:56.282386 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:56.282026 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f889w" podUID="67b8cec4-f05e-4ef7-9456-915dfa5c7554"
Apr 23 16:34:56.282435 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:56.282408 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7bn2z"
Apr 23 16:34:56.282529 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:56.282511 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7bn2z" podUID="38d83fc0-30d4-48d7-8aee-f7afaa404c2e"
Apr 23 16:34:56.795128 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:56.795078 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4b505881-5503-4e1f-b72b-0d8abde1a5e0-original-pull-secret\") pod \"global-pull-secret-syncer-7lg5k\" (UID: \"4b505881-5503-4e1f-b72b-0d8abde1a5e0\") " pod="kube-system/global-pull-secret-syncer-7lg5k"
Apr 23 16:34:56.795306 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:56.795279 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 16:34:56.795386 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:56.795349 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b505881-5503-4e1f-b72b-0d8abde1a5e0-original-pull-secret podName:4b505881-5503-4e1f-b72b-0d8abde1a5e0 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:58.795329952 +0000 UTC m=+9.058115869 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4b505881-5503-4e1f-b72b-0d8abde1a5e0-original-pull-secret") pod "global-pull-secret-syncer-7lg5k" (UID: "4b505881-5503-4e1f-b72b-0d8abde1a5e0") : object "kube-system"/"original-pull-secret" not registered
Apr 23 16:34:57.277884 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:57.277823 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7lg5k"
Apr 23 16:34:57.278060 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:57.277975 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7lg5k" podUID="4b505881-5503-4e1f-b72b-0d8abde1a5e0"
Apr 23 16:34:58.277826 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:58.277777 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7bn2z"
Apr 23 16:34:58.278291 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:58.277909 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7bn2z" podUID="38d83fc0-30d4-48d7-8aee-f7afaa404c2e"
Apr 23 16:34:58.278363 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:58.278287 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f889w"
Apr 23 16:34:58.278467 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:58.278400 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f889w" podUID="67b8cec4-f05e-4ef7-9456-915dfa5c7554"
Apr 23 16:34:58.813001 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:58.812960 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4b505881-5503-4e1f-b72b-0d8abde1a5e0-original-pull-secret\") pod \"global-pull-secret-syncer-7lg5k\" (UID: \"4b505881-5503-4e1f-b72b-0d8abde1a5e0\") " pod="kube-system/global-pull-secret-syncer-7lg5k"
Apr 23 16:34:58.813182 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:58.813115 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 16:34:58.813182 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:58.813179 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b505881-5503-4e1f-b72b-0d8abde1a5e0-original-pull-secret podName:4b505881-5503-4e1f-b72b-0d8abde1a5e0 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:02.81316041 +0000 UTC m=+13.075946324 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4b505881-5503-4e1f-b72b-0d8abde1a5e0-original-pull-secret") pod "global-pull-secret-syncer-7lg5k" (UID: "4b505881-5503-4e1f-b72b-0d8abde1a5e0") : object "kube-system"/"original-pull-secret" not registered
Apr 23 16:34:59.277860 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:59.277824 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7lg5k"
Apr 23 16:34:59.278377 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:59.277968 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7lg5k" podUID="4b505881-5503-4e1f-b72b-0d8abde1a5e0"
Apr 23 16:34:59.822440 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:34:59.822342 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67b8cec4-f05e-4ef7-9456-915dfa5c7554-metrics-certs\") pod \"network-metrics-daemon-f889w\" (UID: \"67b8cec4-f05e-4ef7-9456-915dfa5c7554\") " pod="openshift-multus/network-metrics-daemon-f889w"
Apr 23 16:34:59.822646 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:59.822501 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:34:59.822646 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:34:59.822564 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67b8cec4-f05e-4ef7-9456-915dfa5c7554-metrics-certs podName:67b8cec4-f05e-4ef7-9456-915dfa5c7554 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:07.822545203 +0000 UTC m=+18.085331114 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/67b8cec4-f05e-4ef7-9456-915dfa5c7554-metrics-certs") pod "network-metrics-daemon-f889w" (UID: "67b8cec4-f05e-4ef7-9456-915dfa5c7554") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:00.024286 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:00.024242 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fcvz\" (UniqueName: \"kubernetes.io/projected/38d83fc0-30d4-48d7-8aee-f7afaa404c2e-kube-api-access-2fcvz\") pod \"network-check-target-7bn2z\" (UID: \"38d83fc0-30d4-48d7-8aee-f7afaa404c2e\") " pod="openshift-network-diagnostics/network-check-target-7bn2z"
Apr 23 16:35:00.024515 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:00.024396 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:00.024515 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:00.024413 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:00.024515 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:00.024425 2580 projected.go:194] Error preparing data for projected volume kube-api-access-2fcvz for pod openshift-network-diagnostics/network-check-target-7bn2z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:00.024515 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:00.024481 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38d83fc0-30d4-48d7-8aee-f7afaa404c2e-kube-api-access-2fcvz podName:38d83fc0-30d4-48d7-8aee-f7afaa404c2e nodeName:}" failed. No retries permitted until 2026-04-23 16:35:08.024462842 +0000 UTC m=+18.287248760 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-2fcvz" (UniqueName: "kubernetes.io/projected/38d83fc0-30d4-48d7-8aee-f7afaa404c2e-kube-api-access-2fcvz") pod "network-check-target-7bn2z" (UID: "38d83fc0-30d4-48d7-8aee-f7afaa404c2e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:00.280749 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:00.279309 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f889w"
Apr 23 16:35:00.280749 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:00.279434 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f889w" podUID="67b8cec4-f05e-4ef7-9456-915dfa5c7554"
Apr 23 16:35:00.280749 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:00.279900 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7bn2z"
Apr 23 16:35:00.280749 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:00.279996 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7bn2z" podUID="38d83fc0-30d4-48d7-8aee-f7afaa404c2e"
Apr 23 16:35:01.277787 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:01.277748 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7lg5k"
Apr 23 16:35:01.277975 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:01.277880 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7lg5k" podUID="4b505881-5503-4e1f-b72b-0d8abde1a5e0"
Apr 23 16:35:02.277740 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:02.277699 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7bn2z"
Apr 23 16:35:02.277740 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:02.277741 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f889w"
Apr 23 16:35:02.278250 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:02.277841 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7bn2z" podUID="38d83fc0-30d4-48d7-8aee-f7afaa404c2e"
Apr 23 16:35:02.278250 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:02.277943 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f889w" podUID="67b8cec4-f05e-4ef7-9456-915dfa5c7554"
Apr 23 16:35:02.846569 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:02.846522 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4b505881-5503-4e1f-b72b-0d8abde1a5e0-original-pull-secret\") pod \"global-pull-secret-syncer-7lg5k\" (UID: \"4b505881-5503-4e1f-b72b-0d8abde1a5e0\") " pod="kube-system/global-pull-secret-syncer-7lg5k"
Apr 23 16:35:02.846740 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:02.846671 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 16:35:02.846788 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:02.846742 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b505881-5503-4e1f-b72b-0d8abde1a5e0-original-pull-secret podName:4b505881-5503-4e1f-b72b-0d8abde1a5e0 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:10.846724866 +0000 UTC m=+21.109510782 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4b505881-5503-4e1f-b72b-0d8abde1a5e0-original-pull-secret") pod "global-pull-secret-syncer-7lg5k" (UID: "4b505881-5503-4e1f-b72b-0d8abde1a5e0") : object "kube-system"/"original-pull-secret" not registered
Apr 23 16:35:03.277952 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:03.277917 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7lg5k"
Apr 23 16:35:03.278375 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:03.278052 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7lg5k" podUID="4b505881-5503-4e1f-b72b-0d8abde1a5e0"
Apr 23 16:35:04.277887 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:04.277844 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f889w"
Apr 23 16:35:04.277887 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:04.277870 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7bn2z"
Apr 23 16:35:04.278412 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:04.278000 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f889w" podUID="67b8cec4-f05e-4ef7-9456-915dfa5c7554"
Apr 23 16:35:04.278412 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:04.278118 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7bn2z" podUID="38d83fc0-30d4-48d7-8aee-f7afaa404c2e"
Apr 23 16:35:05.277733 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:05.277701 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7lg5k"
Apr 23 16:35:05.277925 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:05.277825 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7lg5k" podUID="4b505881-5503-4e1f-b72b-0d8abde1a5e0"
Apr 23 16:35:06.280100 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:06.280062 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7bn2z"
Apr 23 16:35:06.280488 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:06.280062 2580 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f889w" Apr 23 16:35:06.280488 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:06.280196 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7bn2z" podUID="38d83fc0-30d4-48d7-8aee-f7afaa404c2e" Apr 23 16:35:06.280488 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:06.280238 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f889w" podUID="67b8cec4-f05e-4ef7-9456-915dfa5c7554" Apr 23 16:35:07.278336 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:07.278297 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7lg5k" Apr 23 16:35:07.278507 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:07.278431 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-7lg5k" podUID="4b505881-5503-4e1f-b72b-0d8abde1a5e0" Apr 23 16:35:07.889134 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:07.889099 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67b8cec4-f05e-4ef7-9456-915dfa5c7554-metrics-certs\") pod \"network-metrics-daemon-f889w\" (UID: \"67b8cec4-f05e-4ef7-9456-915dfa5c7554\") " pod="openshift-multus/network-metrics-daemon-f889w" Apr 23 16:35:07.889597 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:07.889276 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:07.889597 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:07.889344 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67b8cec4-f05e-4ef7-9456-915dfa5c7554-metrics-certs podName:67b8cec4-f05e-4ef7-9456-915dfa5c7554 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:23.889330213 +0000 UTC m=+34.152116128 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/67b8cec4-f05e-4ef7-9456-915dfa5c7554-metrics-certs") pod "network-metrics-daemon-f889w" (UID: "67b8cec4-f05e-4ef7-9456-915dfa5c7554") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:08.090737 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:08.090699 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fcvz\" (UniqueName: \"kubernetes.io/projected/38d83fc0-30d4-48d7-8aee-f7afaa404c2e-kube-api-access-2fcvz\") pod \"network-check-target-7bn2z\" (UID: \"38d83fc0-30d4-48d7-8aee-f7afaa404c2e\") " pod="openshift-network-diagnostics/network-check-target-7bn2z" Apr 23 16:35:08.090938 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:08.090879 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:08.090938 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:08.090903 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:08.090938 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:08.090913 2580 projected.go:194] Error preparing data for projected volume kube-api-access-2fcvz for pod openshift-network-diagnostics/network-check-target-7bn2z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:08.091099 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:08.090968 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38d83fc0-30d4-48d7-8aee-f7afaa404c2e-kube-api-access-2fcvz podName:38d83fc0-30d4-48d7-8aee-f7afaa404c2e nodeName:}" failed. 
No retries permitted until 2026-04-23 16:35:24.09095491 +0000 UTC m=+34.353740822 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-2fcvz" (UniqueName: "kubernetes.io/projected/38d83fc0-30d4-48d7-8aee-f7afaa404c2e-kube-api-access-2fcvz") pod "network-check-target-7bn2z" (UID: "38d83fc0-30d4-48d7-8aee-f7afaa404c2e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:08.280953 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:08.280922 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7bn2z" Apr 23 16:35:08.281154 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:08.280927 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f889w" Apr 23 16:35:08.281154 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:08.281054 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7bn2z" podUID="38d83fc0-30d4-48d7-8aee-f7afaa404c2e" Apr 23 16:35:08.281154 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:08.281118 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f889w" podUID="67b8cec4-f05e-4ef7-9456-915dfa5c7554" Apr 23 16:35:09.277562 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:09.277525 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7lg5k" Apr 23 16:35:09.278152 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:09.277666 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7lg5k" podUID="4b505881-5503-4e1f-b72b-0d8abde1a5e0" Apr 23 16:35:10.277948 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:10.277907 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7bn2z" Apr 23 16:35:10.278294 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:10.278014 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7bn2z" podUID="38d83fc0-30d4-48d7-8aee-f7afaa404c2e" Apr 23 16:35:10.278294 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:10.278059 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f889w" Apr 23 16:35:10.278294 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:10.278165 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f889w" podUID="67b8cec4-f05e-4ef7-9456-915dfa5c7554" Apr 23 16:35:10.911135 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:10.910938 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4b505881-5503-4e1f-b72b-0d8abde1a5e0-original-pull-secret\") pod \"global-pull-secret-syncer-7lg5k\" (UID: \"4b505881-5503-4e1f-b72b-0d8abde1a5e0\") " pod="kube-system/global-pull-secret-syncer-7lg5k" Apr 23 16:35:10.911328 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:10.911079 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:10.911328 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:10.911304 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b505881-5503-4e1f-b72b-0d8abde1a5e0-original-pull-secret podName:4b505881-5503-4e1f-b72b-0d8abde1a5e0 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:26.911280363 +0000 UTC m=+37.174066286 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4b505881-5503-4e1f-b72b-0d8abde1a5e0-original-pull-secret") pod "global-pull-secret-syncer-7lg5k" (UID: "4b505881-5503-4e1f-b72b-0d8abde1a5e0") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:11.277994 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:11.277760 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7lg5k" Apr 23 16:35:11.278832 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:11.278110 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7lg5k" podUID="4b505881-5503-4e1f-b72b-0d8abde1a5e0" Apr 23 16:35:11.362062 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:11.361970 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-78hbm" event={"ID":"325c692e-07b3-4dcc-984b-733489080887","Type":"ContainerStarted","Data":"c9adc61502497bde6b14c50c4e2efb20fbcb7b1c594b991e84773ea2156975e2"} Apr 23 16:35:11.363671 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:11.363635 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" event={"ID":"f3054782-d344-49bd-865a-493a82cdebb1","Type":"ContainerStarted","Data":"bff96c76044b4770acf7673e48518cc19e96642373e97f198e7b76e6c3fc9f6b"} Apr 23 16:35:11.365382 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:11.365353 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8rnmk" event={"ID":"76787c69-1999-41dd-9713-d68801605aa8","Type":"ContainerStarted","Data":"db1bec0a7f47bf42338fadca681b8a8380f16f517d91a769a251eaa56ca5cdc3"} Apr 23 
16:35:11.368188 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:11.368170 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xvn7t_84c993c8-4dd2-40dc-b624-68a9f75a89cb/ovn-acl-logging/0.log" Apr 23 16:35:11.368567 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:11.368539 2580 generic.go:358] "Generic (PLEG): container finished" podID="84c993c8-4dd2-40dc-b624-68a9f75a89cb" containerID="597402b8595eeed17adec8b4f7075c1ad1f5ac0f702f22baf356fa937e1a856b" exitCode=1 Apr 23 16:35:11.368685 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:11.368611 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" event={"ID":"84c993c8-4dd2-40dc-b624-68a9f75a89cb","Type":"ContainerStarted","Data":"4eef4347f00a42964094bc04ed900a988404d7f678aee2f363f8313e0db2bbde"} Apr 23 16:35:11.368685 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:11.368642 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" event={"ID":"84c993c8-4dd2-40dc-b624-68a9f75a89cb","Type":"ContainerStarted","Data":"46e79c05309c16f1323d6ff02fc05e3e5ac64a6c50a954557d55fef923cc5948"} Apr 23 16:35:11.368685 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:11.368657 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" event={"ID":"84c993c8-4dd2-40dc-b624-68a9f75a89cb","Type":"ContainerStarted","Data":"47257b794e5cc6ccb3553db8e1d658f3a693d5dc049f07d7922805add85ac197"} Apr 23 16:35:11.368685 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:11.368670 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" event={"ID":"84c993c8-4dd2-40dc-b624-68a9f75a89cb","Type":"ContainerStarted","Data":"8d20e3d0cf451c1ae03d51b84ea8261fb7e64de97da820d26c8e199c40cf6c6d"} Apr 23 16:35:11.368685 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:11.368682 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" event={"ID":"84c993c8-4dd2-40dc-b624-68a9f75a89cb","Type":"ContainerDied","Data":"597402b8595eeed17adec8b4f7075c1ad1f5ac0f702f22baf356fa937e1a856b"} Apr 23 16:35:11.368835 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:11.368696 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" event={"ID":"84c993c8-4dd2-40dc-b624-68a9f75a89cb","Type":"ContainerStarted","Data":"023a0d0e67606c378078c9b5aa1cf3301ff2d985acafe8e212f2b54fc258be93"} Apr 23 16:35:11.370049 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:11.370023 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk" event={"ID":"6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28","Type":"ContainerStarted","Data":"5ef4edcc23b8d71692b446b28891a10fe5ac52aed5e08ebd53264b299fbc1f04"} Apr 23 16:35:11.371724 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:11.371305 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-cdgwc" event={"ID":"7f20dda8-0907-46fa-84c0-d1304b1105df","Type":"ContainerStarted","Data":"d231f6dfddc72f272e28c0fc125aa9af6b996be1ac13885a07835f56f87e05d6"} Apr 23 16:35:11.374143 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:11.374117 2580 generic.go:358] "Generic (PLEG): container finished" podID="9696cbb9-a1db-4ead-914d-e2d11faa33b6" containerID="daa73a9c5a6ca1d5a69fcf7c11a6717834ed7ac65257ce2731d23b921c6863bc" exitCode=0 Apr 23 16:35:11.374238 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:11.374194 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-58q6m" event={"ID":"9696cbb9-a1db-4ead-914d-e2d11faa33b6","Type":"ContainerDied","Data":"daa73a9c5a6ca1d5a69fcf7c11a6717834ed7ac65257ce2731d23b921c6863bc"} Apr 23 16:35:11.376179 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:11.376158 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/node-ca-rshv8" event={"ID":"95e238a6-f3c9-4b3a-a7de-1bec7cf6b287","Type":"ContainerStarted","Data":"f0d4e8d5ba585d959ba38f573cb7162f06cda79ce347719d63b63a5f0334c0d8"} Apr 23 16:35:11.379336 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:11.379299 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-4.ec2.internal" podStartSLOduration=20.379287507 podStartE2EDuration="20.379287507s" podCreationTimestamp="2026-04-23 16:34:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:34:55.35469299 +0000 UTC m=+5.617478926" watchObservedRunningTime="2026-04-23 16:35:11.379287507 +0000 UTC m=+21.642073439" Apr 23 16:35:11.397091 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:11.397044 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-78hbm" podStartSLOduration=4.046844668 podStartE2EDuration="21.397029323s" podCreationTimestamp="2026-04-23 16:34:50 +0000 UTC" firstStartedPulling="2026-04-23 16:34:52.999045425 +0000 UTC m=+3.261831337" lastFinishedPulling="2026-04-23 16:35:10.349230014 +0000 UTC m=+20.612015992" observedRunningTime="2026-04-23 16:35:11.378907382 +0000 UTC m=+21.641693344" watchObservedRunningTime="2026-04-23 16:35:11.397029323 +0000 UTC m=+21.659815255" Apr 23 16:35:11.397366 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:11.397332 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8rnmk" podStartSLOduration=3.991462102 podStartE2EDuration="21.397325996s" podCreationTimestamp="2026-04-23 16:34:50 +0000 UTC" firstStartedPulling="2026-04-23 16:34:52.966793576 +0000 UTC m=+3.229579487" lastFinishedPulling="2026-04-23 16:35:10.372657455 +0000 UTC m=+20.635443381" observedRunningTime="2026-04-23 16:35:11.396812408 +0000 UTC m=+21.659598342" 
watchObservedRunningTime="2026-04-23 16:35:11.397325996 +0000 UTC m=+21.660111929" Apr 23 16:35:11.413190 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:11.413131 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-cdgwc" podStartSLOduration=4.062947542 podStartE2EDuration="21.413116855s" podCreationTimestamp="2026-04-23 16:34:50 +0000 UTC" firstStartedPulling="2026-04-23 16:34:52.999059038 +0000 UTC m=+3.261844951" lastFinishedPulling="2026-04-23 16:35:10.349228346 +0000 UTC m=+20.612014264" observedRunningTime="2026-04-23 16:35:11.413066309 +0000 UTC m=+21.675852244" watchObservedRunningTime="2026-04-23 16:35:11.413116855 +0000 UTC m=+21.675902828" Apr 23 16:35:11.460516 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:11.460448 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-kwzbs" podStartSLOduration=4.058249458 podStartE2EDuration="21.460428872s" podCreationTimestamp="2026-04-23 16:34:50 +0000 UTC" firstStartedPulling="2026-04-23 16:34:52.970962714 +0000 UTC m=+3.233748625" lastFinishedPulling="2026-04-23 16:35:10.373142128 +0000 UTC m=+20.635928039" observedRunningTime="2026-04-23 16:35:11.459674155 +0000 UTC m=+21.722460095" watchObservedRunningTime="2026-04-23 16:35:11.460428872 +0000 UTC m=+21.723214805" Apr 23 16:35:11.475860 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:11.475811 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-rshv8" podStartSLOduration=12.09334719 podStartE2EDuration="21.475797499s" podCreationTimestamp="2026-04-23 16:34:50 +0000 UTC" firstStartedPulling="2026-04-23 16:34:52.994870851 +0000 UTC m=+3.257656762" lastFinishedPulling="2026-04-23 16:35:02.377321148 +0000 UTC m=+12.640107071" observedRunningTime="2026-04-23 16:35:11.475172735 +0000 UTC m=+21.737958669" watchObservedRunningTime="2026-04-23 16:35:11.475797499 +0000 UTC m=+21.738583432" 
Apr 23 16:35:12.229803 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:12.229772 2580 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 23 16:35:12.267203 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:12.267077 2580 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T16:35:12.229795181Z","UUID":"55bef2c3-6594-4411-a095-c48ea1fdc847","Handler":null,"Name":"","Endpoint":""}
Apr 23 16:35:12.270050 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:12.269882 2580 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 23 16:35:12.270050 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:12.270057 2580 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 23 16:35:12.281414 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:12.281387 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f889w"
Apr 23 16:35:12.281827 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:12.281391 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7bn2z"
Apr 23 16:35:12.281827 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:12.281533 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f889w" podUID="67b8cec4-f05e-4ef7-9456-915dfa5c7554"
Apr 23 16:35:12.281827 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:12.281616 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7bn2z" podUID="38d83fc0-30d4-48d7-8aee-f7afaa404c2e"
Apr 23 16:35:12.380220 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:12.380182 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk" event={"ID":"6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28","Type":"ContainerStarted","Data":"3ab3c5c64026b7757f1d904e1dbeee88271954a15df6da99cf8936610ba75316"}
Apr 23 16:35:12.381690 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:12.381656 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qg8fj" event={"ID":"58db2d9c-607b-4549-8e61-e385991f3a16","Type":"ContainerStarted","Data":"f95cf3115a6c2fbfd2d86a360f2e79c02b49bb562a9e9874e5fe598434ec3bbc"}
Apr 23 16:35:12.398523 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:12.398422 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-qg8fj" podStartSLOduration=5.012934116 podStartE2EDuration="22.398408332s" podCreationTimestamp="2026-04-23 16:34:50 +0000 UTC" firstStartedPulling="2026-04-23 16:34:52.963810295 +0000 UTC m=+3.226596205" lastFinishedPulling="2026-04-23 16:35:10.34928451 +0000 UTC m=+20.612070421" observedRunningTime="2026-04-23 16:35:12.398157509 +0000 UTC m=+22.660943443" watchObservedRunningTime="2026-04-23 16:35:12.398408332 +0000 UTC m=+22.661194264"
Apr 23 16:35:13.278208 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:13.278172 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7lg5k"
Apr 23 16:35:13.278360 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:13.278308 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7lg5k" podUID="4b505881-5503-4e1f-b72b-0d8abde1a5e0"
Apr 23 16:35:14.277326 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:14.277291 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7bn2z"
Apr 23 16:35:14.277834 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:14.277392 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7bn2z" podUID="38d83fc0-30d4-48d7-8aee-f7afaa404c2e"
Apr 23 16:35:14.277834 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:14.277294 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f889w"
Apr 23 16:35:14.277834 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:14.277794 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f889w" podUID="67b8cec4-f05e-4ef7-9456-915dfa5c7554"
Apr 23 16:35:14.388949 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:14.388917 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xvn7t_84c993c8-4dd2-40dc-b624-68a9f75a89cb/ovn-acl-logging/0.log"
Apr 23 16:35:14.389381 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:14.389350 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" event={"ID":"84c993c8-4dd2-40dc-b624-68a9f75a89cb","Type":"ContainerStarted","Data":"e4dc84138f5a035bb0102cf63e3f53f08eb38b4189af6755c2f9f81dbb1780db"}
Apr 23 16:35:14.391351 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:14.391328 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk" event={"ID":"6d5505a1-8a5b-4a8e-8fd3-86a1feb40f28","Type":"ContainerStarted","Data":"1b9ea8ae88f724841e1690549465fc22be4b2b8ff71329c626bb3b1bbbbed683"}
Apr 23 16:35:14.416278 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:14.416219 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rmhqk" podStartSLOduration=3.679601866 podStartE2EDuration="24.41619839s" podCreationTimestamp="2026-04-23 16:34:50 +0000 UTC" firstStartedPulling="2026-04-23 16:34:52.994781678 +0000 UTC m=+3.257567588" lastFinishedPulling="2026-04-23 16:35:13.731378187 +0000 UTC m=+23.994164112" observedRunningTime="2026-04-23 16:35:14.415922213 +0000 UTC m=+24.678708147" watchObservedRunningTime="2026-04-23 16:35:14.41619839 +0000 UTC m=+24.678984327"
Apr 23 16:35:15.277542 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:15.277511 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7lg5k"
Apr 23 16:35:15.278076 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:15.277633 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7lg5k" podUID="4b505881-5503-4e1f-b72b-0d8abde1a5e0"
Apr 23 16:35:16.034023 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:16.033984 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-cdgwc"
Apr 23 16:35:16.035069 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:16.035046 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-cdgwc"
Apr 23 16:35:16.277652 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:16.277618 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f889w"
Apr 23 16:35:16.277652 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:16.277642 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7bn2z"
Apr 23 16:35:16.278137 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:16.277768 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f889w" podUID="67b8cec4-f05e-4ef7-9456-915dfa5c7554"
Apr 23 16:35:16.278137 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:16.277912 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7bn2z" podUID="38d83fc0-30d4-48d7-8aee-f7afaa404c2e"
Apr 23 16:35:16.395061 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:16.394981 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-cdgwc"
Apr 23 16:35:16.395524 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:16.395506 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-cdgwc"
Apr 23 16:35:17.278079 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:17.277881 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7lg5k"
Apr 23 16:35:17.278553 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:17.278158 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7lg5k" podUID="4b505881-5503-4e1f-b72b-0d8abde1a5e0"
Apr 23 16:35:17.398104 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:17.398067 2580 generic.go:358] "Generic (PLEG): container finished" podID="9696cbb9-a1db-4ead-914d-e2d11faa33b6" containerID="88f151962a2c75b43a2227637cb1b964daafcfb8bfc8d6f5395aedc849633045" exitCode=0
Apr 23 16:35:17.398300 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:17.398146 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-58q6m" event={"ID":"9696cbb9-a1db-4ead-914d-e2d11faa33b6","Type":"ContainerDied","Data":"88f151962a2c75b43a2227637cb1b964daafcfb8bfc8d6f5395aedc849633045"}
Apr 23 16:35:17.401340 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:17.401320 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xvn7t_84c993c8-4dd2-40dc-b624-68a9f75a89cb/ovn-acl-logging/0.log"
Apr 23 16:35:17.401731 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:17.401708 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" event={"ID":"84c993c8-4dd2-40dc-b624-68a9f75a89cb","Type":"ContainerStarted","Data":"1ec587a00a73fef95d1e9ee4e93d8f7b2eacb6efde8cf1749e927f0f7c85aadb"}
Apr 23 16:35:17.401887 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:17.401875 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:35:17.401939 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:17.401893 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:35:17.402004 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:17.401983 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:35:17.402139 ip-10-0-142-4 kubenswrapper[2580]: I0423
16:35:17.402122 2580 scope.go:117] "RemoveContainer" containerID="597402b8595eeed17adec8b4f7075c1ad1f5ac0f702f22baf356fa937e1a856b" Apr 23 16:35:17.418409 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:17.418387 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" Apr 23 16:35:17.423059 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:17.423041 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" Apr 23 16:35:18.279861 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:18.279836 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7bn2z" Apr 23 16:35:18.280222 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:18.279836 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f889w" Apr 23 16:35:18.280222 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:18.279959 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7bn2z" podUID="38d83fc0-30d4-48d7-8aee-f7afaa404c2e" Apr 23 16:35:18.280222 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:18.279999 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f889w" podUID="67b8cec4-f05e-4ef7-9456-915dfa5c7554" Apr 23 16:35:18.405183 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:18.405148 2580 generic.go:358] "Generic (PLEG): container finished" podID="9696cbb9-a1db-4ead-914d-e2d11faa33b6" containerID="9bf38f050f0b97cad21a1b79f5cfa6af58cf30836b82af76275887a2b7ac7273" exitCode=0 Apr 23 16:35:18.405344 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:18.405235 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-58q6m" event={"ID":"9696cbb9-a1db-4ead-914d-e2d11faa33b6","Type":"ContainerDied","Data":"9bf38f050f0b97cad21a1b79f5cfa6af58cf30836b82af76275887a2b7ac7273"} Apr 23 16:35:18.408740 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:18.408722 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xvn7t_84c993c8-4dd2-40dc-b624-68a9f75a89cb/ovn-acl-logging/0.log" Apr 23 16:35:18.409149 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:18.409125 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" event={"ID":"84c993c8-4dd2-40dc-b624-68a9f75a89cb","Type":"ContainerStarted","Data":"872efe163e1f400a1e7a277af824fb8047f683cdf5a7d4864d0e0e14b486bb76"} Apr 23 16:35:18.469451 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:18.469406 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t" podStartSLOduration=11.008602404 podStartE2EDuration="28.469390206s" podCreationTimestamp="2026-04-23 16:34:50 +0000 UTC" firstStartedPulling="2026-04-23 16:34:52.972383381 +0000 UTC m=+3.235169293" lastFinishedPulling="2026-04-23 16:35:10.43317117 +0000 UTC m=+20.695957095" observedRunningTime="2026-04-23 16:35:18.465713176 +0000 UTC m=+28.728499110" watchObservedRunningTime="2026-04-23 16:35:18.469390206 +0000 UTC m=+28.732176117" Apr 23 16:35:19.277699 ip-10-0-142-4 kubenswrapper[2580]: 
I0423 16:35:19.277612 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7lg5k" Apr 23 16:35:19.277823 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:19.277719 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7lg5k" podUID="4b505881-5503-4e1f-b72b-0d8abde1a5e0" Apr 23 16:35:19.413000 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:19.412968 2580 generic.go:358] "Generic (PLEG): container finished" podID="9696cbb9-a1db-4ead-914d-e2d11faa33b6" containerID="dde6b3acb0d17a9b03535a6fe6367150de318cfa9c06f28d74dcdd023d947a55" exitCode=0 Apr 23 16:35:19.413349 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:19.413054 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-58q6m" event={"ID":"9696cbb9-a1db-4ead-914d-e2d11faa33b6","Type":"ContainerDied","Data":"dde6b3acb0d17a9b03535a6fe6367150de318cfa9c06f28d74dcdd023d947a55"} Apr 23 16:35:20.278328 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:20.278297 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f889w" Apr 23 16:35:20.278528 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:20.278413 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f889w" podUID="67b8cec4-f05e-4ef7-9456-915dfa5c7554" Apr 23 16:35:20.278528 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:20.278471 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7bn2z" Apr 23 16:35:20.278660 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:20.278552 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7bn2z" podUID="38d83fc0-30d4-48d7-8aee-f7afaa404c2e" Apr 23 16:35:21.278317 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:21.278141 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7lg5k" Apr 23 16:35:21.278697 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:21.278391 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7lg5k" podUID="4b505881-5503-4e1f-b72b-0d8abde1a5e0" Apr 23 16:35:22.278187 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:22.278149 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7bn2z" Apr 23 16:35:22.278387 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:22.278156 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f889w" Apr 23 16:35:22.278387 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:22.278253 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7bn2z" podUID="38d83fc0-30d4-48d7-8aee-f7afaa404c2e" Apr 23 16:35:22.278387 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:22.278365 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f889w" podUID="67b8cec4-f05e-4ef7-9456-915dfa5c7554" Apr 23 16:35:23.277693 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:23.277658 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7lg5k" Apr 23 16:35:23.277848 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:23.277769 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-7lg5k" podUID="4b505881-5503-4e1f-b72b-0d8abde1a5e0" Apr 23 16:35:23.906277 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:23.906232 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67b8cec4-f05e-4ef7-9456-915dfa5c7554-metrics-certs\") pod \"network-metrics-daemon-f889w\" (UID: \"67b8cec4-f05e-4ef7-9456-915dfa5c7554\") " pod="openshift-multus/network-metrics-daemon-f889w" Apr 23 16:35:23.906713 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:23.906360 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:23.906713 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:23.906426 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67b8cec4-f05e-4ef7-9456-915dfa5c7554-metrics-certs podName:67b8cec4-f05e-4ef7-9456-915dfa5c7554 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:55.906407413 +0000 UTC m=+66.169193333 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/67b8cec4-f05e-4ef7-9456-915dfa5c7554-metrics-certs") pod "network-metrics-daemon-f889w" (UID: "67b8cec4-f05e-4ef7-9456-915dfa5c7554") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:24.108096 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:24.108046 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fcvz\" (UniqueName: \"kubernetes.io/projected/38d83fc0-30d4-48d7-8aee-f7afaa404c2e-kube-api-access-2fcvz\") pod \"network-check-target-7bn2z\" (UID: \"38d83fc0-30d4-48d7-8aee-f7afaa404c2e\") " pod="openshift-network-diagnostics/network-check-target-7bn2z" Apr 23 16:35:24.108270 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:24.108219 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:24.108270 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:24.108236 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:24.108270 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:24.108246 2580 projected.go:194] Error preparing data for projected volume kube-api-access-2fcvz for pod openshift-network-diagnostics/network-check-target-7bn2z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:24.108438 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:24.108301 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38d83fc0-30d4-48d7-8aee-f7afaa404c2e-kube-api-access-2fcvz podName:38d83fc0-30d4-48d7-8aee-f7afaa404c2e nodeName:}" failed. 
No retries permitted until 2026-04-23 16:35:56.108287738 +0000 UTC m=+66.371073653 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-2fcvz" (UniqueName: "kubernetes.io/projected/38d83fc0-30d4-48d7-8aee-f7afaa404c2e-kube-api-access-2fcvz") pod "network-check-target-7bn2z" (UID: "38d83fc0-30d4-48d7-8aee-f7afaa404c2e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:24.278706 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:24.278667 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7bn2z" Apr 23 16:35:24.278881 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:24.278802 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7bn2z" podUID="38d83fc0-30d4-48d7-8aee-f7afaa404c2e" Apr 23 16:35:24.278936 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:24.278915 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f889w" Apr 23 16:35:24.279034 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:24.279014 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f889w" podUID="67b8cec4-f05e-4ef7-9456-915dfa5c7554" Apr 23 16:35:24.510645 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:24.510571 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7lg5k"] Apr 23 16:35:24.511550 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:24.511519 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7bn2z"] Apr 23 16:35:24.511708 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:24.511649 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7bn2z" Apr 23 16:35:24.511766 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:24.511752 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7bn2z" podUID="38d83fc0-30d4-48d7-8aee-f7afaa404c2e" Apr 23 16:35:24.512456 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:24.512206 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-f889w"] Apr 23 16:35:24.512456 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:24.512308 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f889w" Apr 23 16:35:24.512456 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:24.512419 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f889w" podUID="67b8cec4-f05e-4ef7-9456-915dfa5c7554" Apr 23 16:35:24.514461 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:24.514301 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7lg5k" Apr 23 16:35:24.514461 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:24.514423 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7lg5k" podUID="4b505881-5503-4e1f-b72b-0d8abde1a5e0" Apr 23 16:35:26.277920 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:26.277879 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7lg5k" Apr 23 16:35:26.278820 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:26.277879 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f889w" Apr 23 16:35:26.278820 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:26.278016 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7lg5k" podUID="4b505881-5503-4e1f-b72b-0d8abde1a5e0" Apr 23 16:35:26.278820 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:26.277879 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7bn2z" Apr 23 16:35:26.278820 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:26.278090 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f889w" podUID="67b8cec4-f05e-4ef7-9456-915dfa5c7554" Apr 23 16:35:26.278820 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:26.278169 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7bn2z" podUID="38d83fc0-30d4-48d7-8aee-f7afaa404c2e" Apr 23 16:35:26.930369 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:26.930135 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4b505881-5503-4e1f-b72b-0d8abde1a5e0-original-pull-secret\") pod \"global-pull-secret-syncer-7lg5k\" (UID: \"4b505881-5503-4e1f-b72b-0d8abde1a5e0\") " pod="kube-system/global-pull-secret-syncer-7lg5k" Apr 23 16:35:26.930553 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:26.930287 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:26.930553 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:26.930447 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b505881-5503-4e1f-b72b-0d8abde1a5e0-original-pull-secret podName:4b505881-5503-4e1f-b72b-0d8abde1a5e0 nodeName:}" failed. 
No retries permitted until 2026-04-23 16:35:58.930429183 +0000 UTC m=+69.193215105 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4b505881-5503-4e1f-b72b-0d8abde1a5e0-original-pull-secret") pod "global-pull-secret-syncer-7lg5k" (UID: "4b505881-5503-4e1f-b72b-0d8abde1a5e0") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:27.430866 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:27.430833 2580 generic.go:358] "Generic (PLEG): container finished" podID="9696cbb9-a1db-4ead-914d-e2d11faa33b6" containerID="b3467e6f6a80b7aae0a9d59befd34897089bf54ad610f2948dd18bd5debfbf93" exitCode=0 Apr 23 16:35:27.431225 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:27.430903 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-58q6m" event={"ID":"9696cbb9-a1db-4ead-914d-e2d11faa33b6","Type":"ContainerDied","Data":"b3467e6f6a80b7aae0a9d59befd34897089bf54ad610f2948dd18bd5debfbf93"} Apr 23 16:35:28.278195 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.278160 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f889w" Apr 23 16:35:28.278348 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.278160 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7lg5k" Apr 23 16:35:28.278348 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:28.278274 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f889w" podUID="67b8cec4-f05e-4ef7-9456-915dfa5c7554" Apr 23 16:35:28.278419 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:28.278362 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7lg5k" podUID="4b505881-5503-4e1f-b72b-0d8abde1a5e0" Apr 23 16:35:28.278419 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.278160 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7bn2z" Apr 23 16:35:28.278479 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:28.278463 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7bn2z" podUID="38d83fc0-30d4-48d7-8aee-f7afaa404c2e" Apr 23 16:35:28.435743 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.435654 2580 generic.go:358] "Generic (PLEG): container finished" podID="9696cbb9-a1db-4ead-914d-e2d11faa33b6" containerID="93746b1d6cc99eb9ea4374542c39a6d01f6cdfda9d2b65effce999a29b8ed221" exitCode=0 Apr 23 16:35:28.435743 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.435713 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-58q6m" event={"ID":"9696cbb9-a1db-4ead-914d-e2d11faa33b6","Type":"ContainerDied","Data":"93746b1d6cc99eb9ea4374542c39a6d01f6cdfda9d2b65effce999a29b8ed221"} Apr 23 16:35:28.595170 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.595147 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-4.ec2.internal" event="NodeReady" Apr 23 16:35:28.595292 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.595282 2580 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 16:35:28.655815 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.655783 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vsd8g"] Apr 23 16:35:28.681971 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.681931 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-pmxx2"] Apr 23 16:35:28.682142 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.682116 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-vsd8g" Apr 23 16:35:28.687104 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.687043 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tz9n7\"" Apr 23 16:35:28.687262 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.687232 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 16:35:28.687351 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.687333 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 16:35:28.704891 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.704851 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-pmxx2"] Apr 23 16:35:28.705053 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.704912 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vsd8g"] Apr 23 16:35:28.705053 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.704929 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-d4d84"] Apr 23 16:35:28.705053 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.704970 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-pmxx2"
Apr 23 16:35:28.708607 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.708565 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 23 16:35:28.708996 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.708977 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 23 16:35:28.709473 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.709452 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 23 16:35:28.709473 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.709471 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-nzf9q\""
Apr 23 16:35:28.709642 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.709549 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 23 16:35:28.734048 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.734021 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-d4d84"]
Apr 23 16:35:28.734197 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.734138 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-d4d84"
Apr 23 16:35:28.737254 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.737233 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-nghfn\""
Apr 23 16:35:28.737254 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.737237 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 23 16:35:28.737447 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.737396 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 23 16:35:28.737487 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.737467 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 23 16:35:28.845723 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.845680 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fklpc\" (UniqueName: \"kubernetes.io/projected/2846178e-072d-415f-9774-a498aa844964-kube-api-access-fklpc\") pod \"insights-runtime-extractor-pmxx2\" (UID: \"2846178e-072d-415f-9774-a498aa844964\") " pod="openshift-insights/insights-runtime-extractor-pmxx2"
Apr 23 16:35:28.845723 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.845725 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2846178e-072d-415f-9774-a498aa844964-data-volume\") pod \"insights-runtime-extractor-pmxx2\" (UID: \"2846178e-072d-415f-9774-a498aa844964\") " pod="openshift-insights/insights-runtime-extractor-pmxx2"
Apr 23 16:35:28.845984 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.845757 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cda3ba79-1ac0-44e6-99b3-60c2f4882479-tmp-dir\") pod \"dns-default-vsd8g\" (UID: \"cda3ba79-1ac0-44e6-99b3-60c2f4882479\") " pod="openshift-dns/dns-default-vsd8g"
Apr 23 16:35:28.845984 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.845776 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b45de36f-3b6e-4c0e-a301-4fa4d410f228-cert\") pod \"ingress-canary-d4d84\" (UID: \"b45de36f-3b6e-4c0e-a301-4fa4d410f228\") " pod="openshift-ingress-canary/ingress-canary-d4d84"
Apr 23 16:35:28.845984 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.845809 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cda3ba79-1ac0-44e6-99b3-60c2f4882479-config-volume\") pod \"dns-default-vsd8g\" (UID: \"cda3ba79-1ac0-44e6-99b3-60c2f4882479\") " pod="openshift-dns/dns-default-vsd8g"
Apr 23 16:35:28.845984 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.845839 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vfhx\" (UniqueName: \"kubernetes.io/projected/b45de36f-3b6e-4c0e-a301-4fa4d410f228-kube-api-access-4vfhx\") pod \"ingress-canary-d4d84\" (UID: \"b45de36f-3b6e-4c0e-a301-4fa4d410f228\") " pod="openshift-ingress-canary/ingress-canary-d4d84"
Apr 23 16:35:28.845984 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.845857 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2846178e-072d-415f-9774-a498aa844964-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pmxx2\" (UID: \"2846178e-072d-415f-9774-a498aa844964\") " pod="openshift-insights/insights-runtime-extractor-pmxx2"
Apr 23 16:35:28.845984 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.845882 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw7vk\" (UniqueName: \"kubernetes.io/projected/cda3ba79-1ac0-44e6-99b3-60c2f4882479-kube-api-access-sw7vk\") pod \"dns-default-vsd8g\" (UID: \"cda3ba79-1ac0-44e6-99b3-60c2f4882479\") " pod="openshift-dns/dns-default-vsd8g"
Apr 23 16:35:28.845984 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.845898 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2846178e-072d-415f-9774-a498aa844964-crio-socket\") pod \"insights-runtime-extractor-pmxx2\" (UID: \"2846178e-072d-415f-9774-a498aa844964\") " pod="openshift-insights/insights-runtime-extractor-pmxx2"
Apr 23 16:35:28.845984 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.845965 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cda3ba79-1ac0-44e6-99b3-60c2f4882479-metrics-tls\") pod \"dns-default-vsd8g\" (UID: \"cda3ba79-1ac0-44e6-99b3-60c2f4882479\") " pod="openshift-dns/dns-default-vsd8g"
Apr 23 16:35:28.846367 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.845994 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2846178e-072d-415f-9774-a498aa844964-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pmxx2\" (UID: \"2846178e-072d-415f-9774-a498aa844964\") " pod="openshift-insights/insights-runtime-extractor-pmxx2"
Apr 23 16:35:28.947083 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.947005 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fklpc\" (UniqueName: \"kubernetes.io/projected/2846178e-072d-415f-9774-a498aa844964-kube-api-access-fklpc\") pod \"insights-runtime-extractor-pmxx2\" (UID: \"2846178e-072d-415f-9774-a498aa844964\") " pod="openshift-insights/insights-runtime-extractor-pmxx2"
Apr 23 16:35:28.947083 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.947056 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2846178e-072d-415f-9774-a498aa844964-data-volume\") pod \"insights-runtime-extractor-pmxx2\" (UID: \"2846178e-072d-415f-9774-a498aa844964\") " pod="openshift-insights/insights-runtime-extractor-pmxx2"
Apr 23 16:35:28.947306 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.947207 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cda3ba79-1ac0-44e6-99b3-60c2f4882479-tmp-dir\") pod \"dns-default-vsd8g\" (UID: \"cda3ba79-1ac0-44e6-99b3-60c2f4882479\") " pod="openshift-dns/dns-default-vsd8g"
Apr 23 16:35:28.947306 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.947252 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b45de36f-3b6e-4c0e-a301-4fa4d410f228-cert\") pod \"ingress-canary-d4d84\" (UID: \"b45de36f-3b6e-4c0e-a301-4fa4d410f228\") " pod="openshift-ingress-canary/ingress-canary-d4d84"
Apr 23 16:35:28.947306 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.947282 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cda3ba79-1ac0-44e6-99b3-60c2f4882479-config-volume\") pod \"dns-default-vsd8g\" (UID: \"cda3ba79-1ac0-44e6-99b3-60c2f4882479\") " pod="openshift-dns/dns-default-vsd8g"
Apr 23 16:35:28.947474 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.947310 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vfhx\" (UniqueName: \"kubernetes.io/projected/b45de36f-3b6e-4c0e-a301-4fa4d410f228-kube-api-access-4vfhx\") pod \"ingress-canary-d4d84\" (UID: \"b45de36f-3b6e-4c0e-a301-4fa4d410f228\") " pod="openshift-ingress-canary/ingress-canary-d4d84"
Apr 23 16:35:28.947474 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.947315 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2846178e-072d-415f-9774-a498aa844964-data-volume\") pod \"insights-runtime-extractor-pmxx2\" (UID: \"2846178e-072d-415f-9774-a498aa844964\") " pod="openshift-insights/insights-runtime-extractor-pmxx2"
Apr 23 16:35:28.947474 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.947336 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2846178e-072d-415f-9774-a498aa844964-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pmxx2\" (UID: \"2846178e-072d-415f-9774-a498aa844964\") " pod="openshift-insights/insights-runtime-extractor-pmxx2"
Apr 23 16:35:28.947474 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.947374 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sw7vk\" (UniqueName: \"kubernetes.io/projected/cda3ba79-1ac0-44e6-99b3-60c2f4882479-kube-api-access-sw7vk\") pod \"dns-default-vsd8g\" (UID: \"cda3ba79-1ac0-44e6-99b3-60c2f4882479\") " pod="openshift-dns/dns-default-vsd8g"
Apr 23 16:35:28.947474 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.947400 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2846178e-072d-415f-9774-a498aa844964-crio-socket\") pod \"insights-runtime-extractor-pmxx2\" (UID: \"2846178e-072d-415f-9774-a498aa844964\") " pod="openshift-insights/insights-runtime-extractor-pmxx2"
Apr 23 16:35:28.947474 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.947456 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cda3ba79-1ac0-44e6-99b3-60c2f4882479-metrics-tls\") pod \"dns-default-vsd8g\" (UID: \"cda3ba79-1ac0-44e6-99b3-60c2f4882479\") " pod="openshift-dns/dns-default-vsd8g"
Apr 23 16:35:28.947807 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.947483 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2846178e-072d-415f-9774-a498aa844964-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pmxx2\" (UID: \"2846178e-072d-415f-9774-a498aa844964\") " pod="openshift-insights/insights-runtime-extractor-pmxx2"
Apr 23 16:35:28.947807 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.947686 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2846178e-072d-415f-9774-a498aa844964-crio-socket\") pod \"insights-runtime-extractor-pmxx2\" (UID: \"2846178e-072d-415f-9774-a498aa844964\") " pod="openshift-insights/insights-runtime-extractor-pmxx2"
Apr 23 16:35:28.947904 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.947866 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cda3ba79-1ac0-44e6-99b3-60c2f4882479-tmp-dir\") pod \"dns-default-vsd8g\" (UID: \"cda3ba79-1ac0-44e6-99b3-60c2f4882479\") " pod="openshift-dns/dns-default-vsd8g"
Apr 23 16:35:28.947950 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.947935 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2846178e-072d-415f-9774-a498aa844964-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pmxx2\" (UID: \"2846178e-072d-415f-9774-a498aa844964\") " pod="openshift-insights/insights-runtime-extractor-pmxx2"
Apr 23 16:35:28.948151 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.948127 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cda3ba79-1ac0-44e6-99b3-60c2f4882479-config-volume\") pod \"dns-default-vsd8g\" (UID: \"cda3ba79-1ac0-44e6-99b3-60c2f4882479\") " pod="openshift-dns/dns-default-vsd8g"
Apr 23 16:35:28.951410 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.951383 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cda3ba79-1ac0-44e6-99b3-60c2f4882479-metrics-tls\") pod \"dns-default-vsd8g\" (UID: \"cda3ba79-1ac0-44e6-99b3-60c2f4882479\") " pod="openshift-dns/dns-default-vsd8g"
Apr 23 16:35:28.951505 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.951391 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2846178e-072d-415f-9774-a498aa844964-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pmxx2\" (UID: \"2846178e-072d-415f-9774-a498aa844964\") " pod="openshift-insights/insights-runtime-extractor-pmxx2"
Apr 23 16:35:28.951552 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.951516 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b45de36f-3b6e-4c0e-a301-4fa4d410f228-cert\") pod \"ingress-canary-d4d84\" (UID: \"b45de36f-3b6e-4c0e-a301-4fa4d410f228\") " pod="openshift-ingress-canary/ingress-canary-d4d84"
Apr 23 16:35:28.957043 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.957013 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fklpc\" (UniqueName: \"kubernetes.io/projected/2846178e-072d-415f-9774-a498aa844964-kube-api-access-fklpc\") pod \"insights-runtime-extractor-pmxx2\" (UID: \"2846178e-072d-415f-9774-a498aa844964\") " pod="openshift-insights/insights-runtime-extractor-pmxx2"
Apr 23 16:35:28.958391 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.958366 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw7vk\" (UniqueName: \"kubernetes.io/projected/cda3ba79-1ac0-44e6-99b3-60c2f4882479-kube-api-access-sw7vk\") pod \"dns-default-vsd8g\" (UID: \"cda3ba79-1ac0-44e6-99b3-60c2f4882479\") " pod="openshift-dns/dns-default-vsd8g"
Apr 23 16:35:28.958507 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.958491 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vfhx\" (UniqueName: \"kubernetes.io/projected/b45de36f-3b6e-4c0e-a301-4fa4d410f228-kube-api-access-4vfhx\") pod \"ingress-canary-d4d84\" (UID: \"b45de36f-3b6e-4c0e-a301-4fa4d410f228\") " pod="openshift-ingress-canary/ingress-canary-d4d84"
Apr 23 16:35:28.992543 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:28.992501 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vsd8g"
Apr 23 16:35:29.021365 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:29.021330 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-pmxx2"
Apr 23 16:35:29.042186 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:29.042152 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-d4d84"
Apr 23 16:35:29.179320 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:29.179290 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vsd8g"]
Apr 23 16:35:29.193767 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:35:29.193730 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcda3ba79_1ac0_44e6_99b3_60c2f4882479.slice/crio-18acddcb4bf8b677c610318f5a6c4bae6cc5945a0f085c45ef1c57aee568d65a WatchSource:0}: Error finding container 18acddcb4bf8b677c610318f5a6c4bae6cc5945a0f085c45ef1c57aee568d65a: Status 404 returned error can't find the container with id 18acddcb4bf8b677c610318f5a6c4bae6cc5945a0f085c45ef1c57aee568d65a
Apr 23 16:35:29.194651 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:29.194535 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-pmxx2"]
Apr 23 16:35:29.207023 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:29.206999 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-d4d84"]
Apr 23 16:35:29.210274 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:35:29.210249 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb45de36f_3b6e_4c0e_a301_4fa4d410f228.slice/crio-c62f11a67ad605ac51d143144373b8d0eb3b342addea37ffa45f9aa9fa5011a8 WatchSource:0}: Error finding container c62f11a67ad605ac51d143144373b8d0eb3b342addea37ffa45f9aa9fa5011a8: Status 404 returned error can't find the container with id c62f11a67ad605ac51d143144373b8d0eb3b342addea37ffa45f9aa9fa5011a8
Apr 23 16:35:29.439204 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:29.439171 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pmxx2" event={"ID":"2846178e-072d-415f-9774-a498aa844964","Type":"ContainerStarted","Data":"98ee70e7f7d816ae971a3786326cbe968a73d48d7182f50ec076b6f085971207"}
Apr 23 16:35:29.439204 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:29.439211 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pmxx2" event={"ID":"2846178e-072d-415f-9774-a498aa844964","Type":"ContainerStarted","Data":"8734e9f2754c0c3a5c498232c105ab5f76e5120e7ec6595db1a77dd2a10af020"}
Apr 23 16:35:29.440248 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:29.440222 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vsd8g" event={"ID":"cda3ba79-1ac0-44e6-99b3-60c2f4882479","Type":"ContainerStarted","Data":"18acddcb4bf8b677c610318f5a6c4bae6cc5945a0f085c45ef1c57aee568d65a"}
Apr 23 16:35:29.443232 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:29.443209 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-58q6m" event={"ID":"9696cbb9-a1db-4ead-914d-e2d11faa33b6","Type":"ContainerStarted","Data":"f8777caaad23eb140f4aeaf128cce2e039c1d0a66b66bb1104b577ff491449b1"}
Apr 23 16:35:29.444221 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:29.444190 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-d4d84" event={"ID":"b45de36f-3b6e-4c0e-a301-4fa4d410f228","Type":"ContainerStarted","Data":"c62f11a67ad605ac51d143144373b8d0eb3b342addea37ffa45f9aa9fa5011a8"}
Apr 23 16:35:29.473070 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:29.472990 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-58q6m" podStartSLOduration=5.899684153 podStartE2EDuration="39.472975094s" podCreationTimestamp="2026-04-23 16:34:50 +0000 UTC" firstStartedPulling="2026-04-23 16:34:52.99478004 +0000 UTC m=+3.257565954" lastFinishedPulling="2026-04-23 16:35:26.568070967 +0000 UTC m=+36.830856895" observedRunningTime="2026-04-23 16:35:29.470945096 +0000 UTC m=+39.733731029" watchObservedRunningTime="2026-04-23 16:35:29.472975094 +0000 UTC m=+39.735761026"
Apr 23 16:35:30.284334 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:30.284296 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7lg5k"
Apr 23 16:35:30.284512 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:30.284399 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7bn2z"
Apr 23 16:35:30.284563 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:30.284513 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f889w"
Apr 23 16:35:30.287899 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:30.287871 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 16:35:30.288071 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:30.288051 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 16:35:30.289124 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:30.289106 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 16:35:30.289124 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:30.289116 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4csk9\""
Apr 23 16:35:30.289373 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:30.289352 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vvm92\""
Apr 23 16:35:30.289694 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:30.289677 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 23 16:35:31.108214 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.108183 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-q8ql4"]
Apr 23 16:35:31.131995 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.131953 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-pvhzm"]
Apr 23 16:35:31.132178 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.132156 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-q8ql4"
Apr 23 16:35:31.135335 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.135307 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 23 16:35:31.135857 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.135835 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 23 16:35:31.136117 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.136098 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 23 16:35:31.136173 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.136158 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 23 16:35:31.136289 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.136106 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-4mw8c\""
Apr 23 16:35:31.136330 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.136307 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 23 16:35:31.136470 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.136457 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 23 16:35:31.153333 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.153299 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-q8ql4"]
Apr 23 16:35:31.153503 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.153482 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-pvhzm"
Apr 23 16:35:31.155959 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.155934 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 23 16:35:31.156374 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.156358 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 23 16:35:31.157113 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.157094 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 23 16:35:31.157229 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.157183 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-g4jgc\""
Apr 23 16:35:31.270902 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.270864 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f8cb88e9-6f22-4927-807b-b213102a45ed-root\") pod \"node-exporter-pvhzm\" (UID: \"f8cb88e9-6f22-4927-807b-b213102a45ed\") " pod="openshift-monitoring/node-exporter-pvhzm"
Apr 23 16:35:31.271105 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.270925 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-q8ql4\" (UID: \"4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8ql4"
Apr 23 16:35:31.271105 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.270953 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f8cb88e9-6f22-4927-807b-b213102a45ed-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pvhzm\" (UID: \"f8cb88e9-6f22-4927-807b-b213102a45ed\") " pod="openshift-monitoring/node-exporter-pvhzm"
Apr 23 16:35:31.271105 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.270987 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f8cb88e9-6f22-4927-807b-b213102a45ed-node-exporter-accelerators-collector-config\") pod \"node-exporter-pvhzm\" (UID: \"f8cb88e9-6f22-4927-807b-b213102a45ed\") " pod="openshift-monitoring/node-exporter-pvhzm"
Apr 23 16:35:31.271105 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.271022 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f8cb88e9-6f22-4927-807b-b213102a45ed-node-exporter-wtmp\") pod \"node-exporter-pvhzm\" (UID: \"f8cb88e9-6f22-4927-807b-b213102a45ed\") " pod="openshift-monitoring/node-exporter-pvhzm"
Apr 23 16:35:31.271105 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.271047 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f8cb88e9-6f22-4927-807b-b213102a45ed-metrics-client-ca\") pod \"node-exporter-pvhzm\" (UID: \"f8cb88e9-6f22-4927-807b-b213102a45ed\") " pod="openshift-monitoring/node-exporter-pvhzm"
Apr 23 16:35:31.271105 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.271091 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-q8ql4\" (UID: \"4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8ql4"
Apr 23 16:35:31.271387 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.271130 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5zgk\" (UniqueName: \"kubernetes.io/projected/4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2-kube-api-access-k5zgk\") pod \"kube-state-metrics-69db897b98-q8ql4\" (UID: \"4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8ql4"
Apr 23 16:35:31.271387 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.271159 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f8cb88e9-6f22-4927-807b-b213102a45ed-sys\") pod \"node-exporter-pvhzm\" (UID: \"f8cb88e9-6f22-4927-807b-b213102a45ed\") " pod="openshift-monitoring/node-exporter-pvhzm"
Apr 23 16:35:31.271387 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.271195 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-q8ql4\" (UID: \"4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8ql4"
Apr 23 16:35:31.271387 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.271238 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-q8ql4\" (UID: \"4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8ql4"
Apr 23 16:35:31.271387 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.271265 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f8cb88e9-6f22-4927-807b-b213102a45ed-node-exporter-textfile\") pod \"node-exporter-pvhzm\" (UID: \"f8cb88e9-6f22-4927-807b-b213102a45ed\") " pod="openshift-monitoring/node-exporter-pvhzm"
Apr 23 16:35:31.271387 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.271289 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jbkr\" (UniqueName: \"kubernetes.io/projected/f8cb88e9-6f22-4927-807b-b213102a45ed-kube-api-access-2jbkr\") pod \"node-exporter-pvhzm\" (UID: \"f8cb88e9-6f22-4927-807b-b213102a45ed\") " pod="openshift-monitoring/node-exporter-pvhzm"
Apr 23 16:35:31.271387 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.271342 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f8cb88e9-6f22-4927-807b-b213102a45ed-node-exporter-tls\") pod \"node-exporter-pvhzm\" (UID: \"f8cb88e9-6f22-4927-807b-b213102a45ed\") " pod="openshift-monitoring/node-exporter-pvhzm"
Apr 23 16:35:31.271820 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.271391 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-q8ql4\" (UID: \"4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8ql4"
Apr 23 16:35:31.372705 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.372597 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f8cb88e9-6f22-4927-807b-b213102a45ed-root\") pod \"node-exporter-pvhzm\" (UID: \"f8cb88e9-6f22-4927-807b-b213102a45ed\") " pod="openshift-monitoring/node-exporter-pvhzm"
Apr 23 16:35:31.372705 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.372657 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f8cb88e9-6f22-4927-807b-b213102a45ed-root\") pod \"node-exporter-pvhzm\" (UID: \"f8cb88e9-6f22-4927-807b-b213102a45ed\") " pod="openshift-monitoring/node-exporter-pvhzm"
Apr 23 16:35:31.372705 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.372676 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-q8ql4\" (UID: \"4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8ql4"
Apr 23 16:35:31.372981 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.372711 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f8cb88e9-6f22-4927-807b-b213102a45ed-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pvhzm\" (UID: \"f8cb88e9-6f22-4927-807b-b213102a45ed\") " pod="openshift-monitoring/node-exporter-pvhzm"
Apr 23 16:35:31.372981 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.372783 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f8cb88e9-6f22-4927-807b-b213102a45ed-node-exporter-accelerators-collector-config\") pod \"node-exporter-pvhzm\" (UID: \"f8cb88e9-6f22-4927-807b-b213102a45ed\") " pod="openshift-monitoring/node-exporter-pvhzm"
Apr 23 16:35:31.372981 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.372820 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f8cb88e9-6f22-4927-807b-b213102a45ed-node-exporter-wtmp\") pod \"node-exporter-pvhzm\" (UID: \"f8cb88e9-6f22-4927-807b-b213102a45ed\") " pod="openshift-monitoring/node-exporter-pvhzm"
Apr 23 16:35:31.372981 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.372856 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f8cb88e9-6f22-4927-807b-b213102a45ed-metrics-client-ca\") pod \"node-exporter-pvhzm\" (UID: \"f8cb88e9-6f22-4927-807b-b213102a45ed\") " pod="openshift-monitoring/node-exporter-pvhzm"
Apr 23 16:35:31.372981 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.372883 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-q8ql4\" (UID: \"4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8ql4"
Apr 23 16:35:31.372981 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.372902 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5zgk\" (UniqueName: \"kubernetes.io/projected/4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2-kube-api-access-k5zgk\") pod \"kube-state-metrics-69db897b98-q8ql4\" (UID: \"4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8ql4"
Apr 23 16:35:31.372981 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.372918 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f8cb88e9-6f22-4927-807b-b213102a45ed-sys\") pod \"node-exporter-pvhzm\" (UID: \"f8cb88e9-6f22-4927-807b-b213102a45ed\") " pod="openshift-monitoring/node-exporter-pvhzm"
Apr 23 16:35:31.372981 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.372957 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-q8ql4\" (UID: \"4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8ql4"
Apr 23 16:35:31.372981 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.372982 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-q8ql4\" (UID: \"4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8ql4"
Apr 23 16:35:31.373344 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.373000 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f8cb88e9-6f22-4927-807b-b213102a45ed-node-exporter-textfile\") pod \"node-exporter-pvhzm\" (UID: \"f8cb88e9-6f22-4927-807b-b213102a45ed\") " pod="openshift-monitoring/node-exporter-pvhzm"
Apr 23 16:35:31.373344
ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.373016 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2jbkr\" (UniqueName: \"kubernetes.io/projected/f8cb88e9-6f22-4927-807b-b213102a45ed-kube-api-access-2jbkr\") pod \"node-exporter-pvhzm\" (UID: \"f8cb88e9-6f22-4927-807b-b213102a45ed\") " pod="openshift-monitoring/node-exporter-pvhzm" Apr 23 16:35:31.373344 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.373050 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f8cb88e9-6f22-4927-807b-b213102a45ed-node-exporter-tls\") pod \"node-exporter-pvhzm\" (UID: \"f8cb88e9-6f22-4927-807b-b213102a45ed\") " pod="openshift-monitoring/node-exporter-pvhzm" Apr 23 16:35:31.373344 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.373089 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-q8ql4\" (UID: \"4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8ql4" Apr 23 16:35:31.373344 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.373143 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f8cb88e9-6f22-4927-807b-b213102a45ed-sys\") pod \"node-exporter-pvhzm\" (UID: \"f8cb88e9-6f22-4927-807b-b213102a45ed\") " pod="openshift-monitoring/node-exporter-pvhzm" Apr 23 16:35:31.373344 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.373292 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f8cb88e9-6f22-4927-807b-b213102a45ed-node-exporter-wtmp\") pod \"node-exporter-pvhzm\" (UID: \"f8cb88e9-6f22-4927-807b-b213102a45ed\") " 
pod="openshift-monitoring/node-exporter-pvhzm" Apr 23 16:35:31.373674 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.373422 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f8cb88e9-6f22-4927-807b-b213102a45ed-node-exporter-accelerators-collector-config\") pod \"node-exporter-pvhzm\" (UID: \"f8cb88e9-6f22-4927-807b-b213102a45ed\") " pod="openshift-monitoring/node-exporter-pvhzm" Apr 23 16:35:31.373674 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:31.373659 2580 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 23 16:35:31.373781 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:31.373735 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8cb88e9-6f22-4927-807b-b213102a45ed-node-exporter-tls podName:f8cb88e9-6f22-4927-807b-b213102a45ed nodeName:}" failed. No retries permitted until 2026-04-23 16:35:31.873716007 +0000 UTC m=+42.136501918 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/f8cb88e9-6f22-4927-807b-b213102a45ed-node-exporter-tls") pod "node-exporter-pvhzm" (UID: "f8cb88e9-6f22-4927-807b-b213102a45ed") : secret "node-exporter-tls" not found Apr 23 16:35:31.373851 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.373834 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f8cb88e9-6f22-4927-807b-b213102a45ed-metrics-client-ca\") pod \"node-exporter-pvhzm\" (UID: \"f8cb88e9-6f22-4927-807b-b213102a45ed\") " pod="openshift-monitoring/node-exporter-pvhzm" Apr 23 16:35:31.373920 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:31.373898 2580 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 23 16:35:31.374044 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:31.373949 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2-kube-state-metrics-tls podName:4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:31.873935466 +0000 UTC m=+42.136721380 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-q8ql4" (UID: "4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2") : secret "kube-state-metrics-tls" not found Apr 23 16:35:31.374634 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.374560 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-q8ql4\" (UID: \"4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8ql4" Apr 23 16:35:31.374634 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.374615 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-q8ql4\" (UID: \"4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8ql4" Apr 23 16:35:31.374830 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.374807 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f8cb88e9-6f22-4927-807b-b213102a45ed-node-exporter-textfile\") pod \"node-exporter-pvhzm\" (UID: \"f8cb88e9-6f22-4927-807b-b213102a45ed\") " pod="openshift-monitoring/node-exporter-pvhzm" Apr 23 16:35:31.375516 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.375493 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f8cb88e9-6f22-4927-807b-b213102a45ed-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pvhzm\" (UID: 
\"f8cb88e9-6f22-4927-807b-b213102a45ed\") " pod="openshift-monitoring/node-exporter-pvhzm" Apr 23 16:35:31.375827 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.375808 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-q8ql4\" (UID: \"4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8ql4" Apr 23 16:35:31.383267 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.383236 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jbkr\" (UniqueName: \"kubernetes.io/projected/f8cb88e9-6f22-4927-807b-b213102a45ed-kube-api-access-2jbkr\") pod \"node-exporter-pvhzm\" (UID: \"f8cb88e9-6f22-4927-807b-b213102a45ed\") " pod="openshift-monitoring/node-exporter-pvhzm" Apr 23 16:35:31.383381 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.383338 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5zgk\" (UniqueName: \"kubernetes.io/projected/4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2-kube-api-access-k5zgk\") pod \"kube-state-metrics-69db897b98-q8ql4\" (UID: \"4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8ql4" Apr 23 16:35:31.385362 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.385339 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-q8ql4\" (UID: \"4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8ql4" Apr 23 16:35:31.450178 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.450141 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-pmxx2" event={"ID":"2846178e-072d-415f-9774-a498aa844964","Type":"ContainerStarted","Data":"8a759fd36183883ff7ae6e43ab36b8021b404e353e9bfaebf9c9b72b552767fe"} Apr 23 16:35:31.877110 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.877077 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f8cb88e9-6f22-4927-807b-b213102a45ed-node-exporter-tls\") pod \"node-exporter-pvhzm\" (UID: \"f8cb88e9-6f22-4927-807b-b213102a45ed\") " pod="openshift-monitoring/node-exporter-pvhzm" Apr 23 16:35:31.877282 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.877200 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-q8ql4\" (UID: \"4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8ql4" Apr 23 16:35:31.879649 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.879618 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f8cb88e9-6f22-4927-807b-b213102a45ed-node-exporter-tls\") pod \"node-exporter-pvhzm\" (UID: \"f8cb88e9-6f22-4927-807b-b213102a45ed\") " pod="openshift-monitoring/node-exporter-pvhzm" Apr 23 16:35:31.880303 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:31.880280 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-q8ql4\" (UID: \"4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-q8ql4" Apr 23 16:35:32.048488 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:32.048419 2580 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-q8ql4" Apr 23 16:35:32.064294 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:32.064262 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-pvhzm" Apr 23 16:35:32.073769 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:35:32.073723 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8cb88e9_6f22_4927_807b_b213102a45ed.slice/crio-504931c337bfe69885deab4e434af6d7bbbf32a7c0d9444651b732332f00d1b9 WatchSource:0}: Error finding container 504931c337bfe69885deab4e434af6d7bbbf32a7c0d9444651b732332f00d1b9: Status 404 returned error can't find the container with id 504931c337bfe69885deab4e434af6d7bbbf32a7c0d9444651b732332f00d1b9 Apr 23 16:35:32.216423 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:32.216386 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-q8ql4"] Apr 23 16:35:32.222709 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:35:32.222675 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4abd8a2b_c2da_41d7_8806_7ccb0fdbeae2.slice/crio-74b67cb9864ac041fa609abb520bb3ee2d8bcd35a2d3bc57df369acf3307121c WatchSource:0}: Error finding container 74b67cb9864ac041fa609abb520bb3ee2d8bcd35a2d3bc57df369acf3307121c: Status 404 returned error can't find the container with id 74b67cb9864ac041fa609abb520bb3ee2d8bcd35a2d3bc57df369acf3307121c Apr 23 16:35:32.455241 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:32.455168 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vsd8g" event={"ID":"cda3ba79-1ac0-44e6-99b3-60c2f4882479","Type":"ContainerStarted","Data":"226eb556d831c810ced06f10fb951216a7ecc4dabc2e1f984d1fa05bafc4b0d5"} Apr 23 16:35:32.455241 ip-10-0-142-4 
kubenswrapper[2580]: I0423 16:35:32.455215 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vsd8g" event={"ID":"cda3ba79-1ac0-44e6-99b3-60c2f4882479","Type":"ContainerStarted","Data":"f4c53c84670312e4ee0558a2fccae43a90d9e6412465700d67b5993aae86a10f"} Apr 23 16:35:32.455492 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:32.455323 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-vsd8g" Apr 23 16:35:32.456375 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:32.456350 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pvhzm" event={"ID":"f8cb88e9-6f22-4927-807b-b213102a45ed","Type":"ContainerStarted","Data":"504931c337bfe69885deab4e434af6d7bbbf32a7c0d9444651b732332f00d1b9"} Apr 23 16:35:32.457620 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:32.457566 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-q8ql4" event={"ID":"4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2","Type":"ContainerStarted","Data":"74b67cb9864ac041fa609abb520bb3ee2d8bcd35a2d3bc57df369acf3307121c"} Apr 23 16:35:32.458928 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:32.458902 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-d4d84" event={"ID":"b45de36f-3b6e-4c0e-a301-4fa4d410f228","Type":"ContainerStarted","Data":"67cbebc6cc5ed01638eae9f7af807eb64188ec434cb59a3ad2e4b6c43e769a39"} Apr 23 16:35:32.482084 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:32.481965 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vsd8g" podStartSLOduration=2.007622525 podStartE2EDuration="4.481914969s" podCreationTimestamp="2026-04-23 16:35:28 +0000 UTC" firstStartedPulling="2026-04-23 16:35:29.196074821 +0000 UTC m=+39.458860732" lastFinishedPulling="2026-04-23 16:35:31.670367261 +0000 UTC m=+41.933153176" 
observedRunningTime="2026-04-23 16:35:32.481106325 +0000 UTC m=+42.743892261" watchObservedRunningTime="2026-04-23 16:35:32.481914969 +0000 UTC m=+42.744700903" Apr 23 16:35:32.506309 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:32.506254 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-d4d84" podStartSLOduration=2.052486368 podStartE2EDuration="4.50623852s" podCreationTimestamp="2026-04-23 16:35:28 +0000 UTC" firstStartedPulling="2026-04-23 16:35:29.212198947 +0000 UTC m=+39.474984859" lastFinishedPulling="2026-04-23 16:35:31.665951089 +0000 UTC m=+41.928737011" observedRunningTime="2026-04-23 16:35:32.505464919 +0000 UTC m=+42.768250866" watchObservedRunningTime="2026-04-23 16:35:32.50623852 +0000 UTC m=+42.769024447" Apr 23 16:35:33.466829 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:33.466787 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pmxx2" event={"ID":"2846178e-072d-415f-9774-a498aa844964","Type":"ContainerStarted","Data":"780514ca5179711ed3a4070bf0ac64cd505da95eb49bc3c36f5ac41b7ce619c5"} Apr 23 16:35:33.491157 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:33.491108 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-pmxx2" podStartSLOduration=1.6342689350000001 podStartE2EDuration="5.491092197s" podCreationTimestamp="2026-04-23 16:35:28 +0000 UTC" firstStartedPulling="2026-04-23 16:35:29.310042137 +0000 UTC m=+39.572828052" lastFinishedPulling="2026-04-23 16:35:33.166865398 +0000 UTC m=+43.429651314" observedRunningTime="2026-04-23 16:35:33.490921642 +0000 UTC m=+43.753707576" watchObservedRunningTime="2026-04-23 16:35:33.491092197 +0000 UTC m=+43.753878124" Apr 23 16:35:34.235324 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.235290 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-d749c5fc6-5sfqd"] Apr 23 
16:35:34.265167 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.265132 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-d749c5fc6-5sfqd"] Apr 23 16:35:34.265312 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.265249 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" Apr 23 16:35:34.268776 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.268754 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 23 16:35:34.269431 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.269408 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-b2hl2\"" Apr 23 16:35:34.269563 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.269407 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 23 16:35:34.269563 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.269472 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-e9se91v71rd\"" Apr 23 16:35:34.269563 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.269505 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 23 16:35:34.269563 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.269548 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 23 16:35:34.269917 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.269900 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 23 16:35:34.298269 ip-10-0-142-4 
kubenswrapper[2580]: I0423 16:35:34.298234 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7e9f2050-7154-45e4-8941-45715943a2f9-secret-grpc-tls\") pod \"thanos-querier-d749c5fc6-5sfqd\" (UID: \"7e9f2050-7154-45e4-8941-45715943a2f9\") " pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" Apr 23 16:35:34.298269 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.298270 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wf77\" (UniqueName: \"kubernetes.io/projected/7e9f2050-7154-45e4-8941-45715943a2f9-kube-api-access-5wf77\") pod \"thanos-querier-d749c5fc6-5sfqd\" (UID: \"7e9f2050-7154-45e4-8941-45715943a2f9\") " pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" Apr 23 16:35:34.298509 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.298359 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7e9f2050-7154-45e4-8941-45715943a2f9-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-d749c5fc6-5sfqd\" (UID: \"7e9f2050-7154-45e4-8941-45715943a2f9\") " pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" Apr 23 16:35:34.298509 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.298388 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/7e9f2050-7154-45e4-8941-45715943a2f9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-d749c5fc6-5sfqd\" (UID: \"7e9f2050-7154-45e4-8941-45715943a2f9\") " pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" Apr 23 16:35:34.298509 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.298470 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/7e9f2050-7154-45e4-8941-45715943a2f9-secret-thanos-querier-tls\") pod \"thanos-querier-d749c5fc6-5sfqd\" (UID: \"7e9f2050-7154-45e4-8941-45715943a2f9\") " pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" Apr 23 16:35:34.298509 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.298501 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/7e9f2050-7154-45e4-8941-45715943a2f9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-d749c5fc6-5sfqd\" (UID: \"7e9f2050-7154-45e4-8941-45715943a2f9\") " pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" Apr 23 16:35:34.298696 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.298539 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e9f2050-7154-45e4-8941-45715943a2f9-metrics-client-ca\") pod \"thanos-querier-d749c5fc6-5sfqd\" (UID: \"7e9f2050-7154-45e4-8941-45715943a2f9\") " pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" Apr 23 16:35:34.298696 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.298612 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7e9f2050-7154-45e4-8941-45715943a2f9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-d749c5fc6-5sfqd\" (UID: \"7e9f2050-7154-45e4-8941-45715943a2f9\") " pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" Apr 23 16:35:34.399716 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.399685 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/7e9f2050-7154-45e4-8941-45715943a2f9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-d749c5fc6-5sfqd\" (UID: \"7e9f2050-7154-45e4-8941-45715943a2f9\") " pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" Apr 23 16:35:34.399822 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.399726 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7e9f2050-7154-45e4-8941-45715943a2f9-secret-grpc-tls\") pod \"thanos-querier-d749c5fc6-5sfqd\" (UID: \"7e9f2050-7154-45e4-8941-45715943a2f9\") " pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" Apr 23 16:35:34.399822 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.399745 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5wf77\" (UniqueName: \"kubernetes.io/projected/7e9f2050-7154-45e4-8941-45715943a2f9-kube-api-access-5wf77\") pod \"thanos-querier-d749c5fc6-5sfqd\" (UID: \"7e9f2050-7154-45e4-8941-45715943a2f9\") " pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" Apr 23 16:35:34.399822 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.399778 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7e9f2050-7154-45e4-8941-45715943a2f9-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-d749c5fc6-5sfqd\" (UID: \"7e9f2050-7154-45e4-8941-45715943a2f9\") " pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" Apr 23 16:35:34.399822 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.399797 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/7e9f2050-7154-45e4-8941-45715943a2f9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-d749c5fc6-5sfqd\" (UID: \"7e9f2050-7154-45e4-8941-45715943a2f9\") " 
pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" Apr 23 16:35:34.399987 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.399863 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/7e9f2050-7154-45e4-8941-45715943a2f9-secret-thanos-querier-tls\") pod \"thanos-querier-d749c5fc6-5sfqd\" (UID: \"7e9f2050-7154-45e4-8941-45715943a2f9\") " pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" Apr 23 16:35:34.399987 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.399896 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/7e9f2050-7154-45e4-8941-45715943a2f9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-d749c5fc6-5sfqd\" (UID: \"7e9f2050-7154-45e4-8941-45715943a2f9\") " pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" Apr 23 16:35:34.399987 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.399930 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e9f2050-7154-45e4-8941-45715943a2f9-metrics-client-ca\") pod \"thanos-querier-d749c5fc6-5sfqd\" (UID: \"7e9f2050-7154-45e4-8941-45715943a2f9\") " pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" Apr 23 16:35:34.400941 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.400910 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e9f2050-7154-45e4-8941-45715943a2f9-metrics-client-ca\") pod \"thanos-querier-d749c5fc6-5sfqd\" (UID: \"7e9f2050-7154-45e4-8941-45715943a2f9\") " pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" Apr 23 16:35:34.402662 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.402638 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/7e9f2050-7154-45e4-8941-45715943a2f9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-d749c5fc6-5sfqd\" (UID: \"7e9f2050-7154-45e4-8941-45715943a2f9\") " pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" Apr 23 16:35:34.402740 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.402646 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7e9f2050-7154-45e4-8941-45715943a2f9-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-d749c5fc6-5sfqd\" (UID: \"7e9f2050-7154-45e4-8941-45715943a2f9\") " pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" Apr 23 16:35:34.402846 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.402828 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7e9f2050-7154-45e4-8941-45715943a2f9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-d749c5fc6-5sfqd\" (UID: \"7e9f2050-7154-45e4-8941-45715943a2f9\") " pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" Apr 23 16:35:34.403148 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.403118 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7e9f2050-7154-45e4-8941-45715943a2f9-secret-grpc-tls\") pod \"thanos-querier-d749c5fc6-5sfqd\" (UID: \"7e9f2050-7154-45e4-8941-45715943a2f9\") " pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" Apr 23 16:35:34.403148 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.403131 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/7e9f2050-7154-45e4-8941-45715943a2f9-secret-thanos-querier-tls\") pod \"thanos-querier-d749c5fc6-5sfqd\" (UID: 
\"7e9f2050-7154-45e4-8941-45715943a2f9\") " pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" Apr 23 16:35:34.413943 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.413920 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/7e9f2050-7154-45e4-8941-45715943a2f9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-d749c5fc6-5sfqd\" (UID: \"7e9f2050-7154-45e4-8941-45715943a2f9\") " pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" Apr 23 16:35:34.428220 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.428194 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wf77\" (UniqueName: \"kubernetes.io/projected/7e9f2050-7154-45e4-8941-45715943a2f9-kube-api-access-5wf77\") pod \"thanos-querier-d749c5fc6-5sfqd\" (UID: \"7e9f2050-7154-45e4-8941-45715943a2f9\") " pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" Apr 23 16:35:34.446474 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.446446 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-7zcwv"] Apr 23 16:35:34.469331 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.469291 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-7zcwv"] Apr 23 16:35:34.469797 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.469437 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-7zcwv" Apr 23 16:35:34.471651 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.471620 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pvhzm" event={"ID":"f8cb88e9-6f22-4927-807b-b213102a45ed","Type":"ContainerStarted","Data":"cc100d4149ea566bb271d359abe786e19a525e09551735e5ef3efdb74212eb66"} Apr 23 16:35:34.473303 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.473284 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 23 16:35:34.473406 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.473348 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 23 16:35:34.473854 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.473835 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-kgptm\"" Apr 23 16:35:34.501090 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.500981 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9blhs\" (UniqueName: \"kubernetes.io/projected/e2859776-cf83-49ed-ac30-69c2ec6863e5-kube-api-access-9blhs\") pod \"downloads-6bcc868b7-7zcwv\" (UID: \"e2859776-cf83-49ed-ac30-69c2ec6863e5\") " pod="openshift-console/downloads-6bcc868b7-7zcwv" Apr 23 16:35:34.574329 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.574297 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" Apr 23 16:35:34.601973 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.601932 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9blhs\" (UniqueName: \"kubernetes.io/projected/e2859776-cf83-49ed-ac30-69c2ec6863e5-kube-api-access-9blhs\") pod \"downloads-6bcc868b7-7zcwv\" (UID: \"e2859776-cf83-49ed-ac30-69c2ec6863e5\") " pod="openshift-console/downloads-6bcc868b7-7zcwv" Apr 23 16:35:34.611741 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.611715 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9blhs\" (UniqueName: \"kubernetes.io/projected/e2859776-cf83-49ed-ac30-69c2ec6863e5-kube-api-access-9blhs\") pod \"downloads-6bcc868b7-7zcwv\" (UID: \"e2859776-cf83-49ed-ac30-69c2ec6863e5\") " pod="openshift-console/downloads-6bcc868b7-7zcwv" Apr 23 16:35:34.779397 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.779363 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-7zcwv" Apr 23 16:35:34.925933 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.925899 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-d749c5fc6-5sfqd"] Apr 23 16:35:34.943747 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:34.943727 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-7zcwv"] Apr 23 16:35:35.438137 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.438060 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7c7c78bc5d-5knsj"] Apr 23 16:35:35.470185 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.470139 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7c7c78bc5d-5knsj"] Apr 23 16:35:35.470616 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.470292 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj" Apr 23 16:35:35.474142 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.474120 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 23 16:35:35.474299 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.474243 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 23 16:35:35.474647 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.474629 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-bqqvofg61ckv4\"" Apr 23 16:35:35.474737 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.474629 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 23 16:35:35.475616 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.475065 
2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 23 16:35:35.475616 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.475137 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-d4wpz\"" Apr 23 16:35:35.477188 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.477153 2580 generic.go:358] "Generic (PLEG): container finished" podID="f8cb88e9-6f22-4927-807b-b213102a45ed" containerID="cc100d4149ea566bb271d359abe786e19a525e09551735e5ef3efdb74212eb66" exitCode=0 Apr 23 16:35:35.477311 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.477242 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pvhzm" event={"ID":"f8cb88e9-6f22-4927-807b-b213102a45ed","Type":"ContainerDied","Data":"cc100d4149ea566bb271d359abe786e19a525e09551735e5ef3efdb74212eb66"} Apr 23 16:35:35.480010 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.479970 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-q8ql4" event={"ID":"4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2","Type":"ContainerStarted","Data":"9699f3ac8e478d09d3215e69c3eaf7c09b276caccdf2cf12edbfadbf28306910"} Apr 23 16:35:35.480010 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.480009 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-q8ql4" event={"ID":"4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2","Type":"ContainerStarted","Data":"70f0508343f921f40693fe63fc27e1ea0a8a62de4cbf91dbab6b56967f370fad"} Apr 23 16:35:35.480135 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.480024 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-q8ql4" 
event={"ID":"4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2","Type":"ContainerStarted","Data":"c9ce6792b80602fee6cc3eeeb2ab82c443a40fc2e9262f410b49c4c6bc2c02de"} Apr 23 16:35:35.481247 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.481225 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-7zcwv" event={"ID":"e2859776-cf83-49ed-ac30-69c2ec6863e5","Type":"ContainerStarted","Data":"120ad1560729b459a112dfa9ff09cb2112d2fd72ade4b196d17a4abf96a9a6b3"} Apr 23 16:35:35.482245 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.482225 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" event={"ID":"7e9f2050-7154-45e4-8941-45715943a2f9","Type":"ContainerStarted","Data":"6bc8e61d52ef0a9aa0e86f5f214a306a2a8e2979fdae2f4718f1e5d76c7d55f1"} Apr 23 16:35:35.509570 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.509539 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e32dc24f-77a9-4c9d-a568-3e7866f08632-secret-metrics-server-client-certs\") pod \"metrics-server-7c7c78bc5d-5knsj\" (UID: \"e32dc24f-77a9-4c9d-a568-3e7866f08632\") " pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj" Apr 23 16:35:35.509719 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.509629 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e32dc24f-77a9-4c9d-a568-3e7866f08632-secret-metrics-server-tls\") pod \"metrics-server-7c7c78bc5d-5knsj\" (UID: \"e32dc24f-77a9-4c9d-a568-3e7866f08632\") " pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj" Apr 23 16:35:35.509719 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.509678 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/e32dc24f-77a9-4c9d-a568-3e7866f08632-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7c7c78bc5d-5knsj\" (UID: \"e32dc24f-77a9-4c9d-a568-3e7866f08632\") " pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj" Apr 23 16:35:35.509719 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.509706 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e32dc24f-77a9-4c9d-a568-3e7866f08632-metrics-server-audit-profiles\") pod \"metrics-server-7c7c78bc5d-5knsj\" (UID: \"e32dc24f-77a9-4c9d-a568-3e7866f08632\") " pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj" Apr 23 16:35:35.509878 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.509777 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e32dc24f-77a9-4c9d-a568-3e7866f08632-audit-log\") pod \"metrics-server-7c7c78bc5d-5knsj\" (UID: \"e32dc24f-77a9-4c9d-a568-3e7866f08632\") " pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj" Apr 23 16:35:35.509878 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.509853 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e32dc24f-77a9-4c9d-a568-3e7866f08632-client-ca-bundle\") pod \"metrics-server-7c7c78bc5d-5knsj\" (UID: \"e32dc24f-77a9-4c9d-a568-3e7866f08632\") " pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj" Apr 23 16:35:35.509967 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.509933 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zp6n\" (UniqueName: \"kubernetes.io/projected/e32dc24f-77a9-4c9d-a568-3e7866f08632-kube-api-access-4zp6n\") pod \"metrics-server-7c7c78bc5d-5knsj\" (UID: 
\"e32dc24f-77a9-4c9d-a568-3e7866f08632\") " pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj" Apr 23 16:35:35.519433 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.519381 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-q8ql4" podStartSLOduration=1.969934378 podStartE2EDuration="4.519359896s" podCreationTimestamp="2026-04-23 16:35:31 +0000 UTC" firstStartedPulling="2026-04-23 16:35:32.225229449 +0000 UTC m=+42.488015369" lastFinishedPulling="2026-04-23 16:35:34.774654971 +0000 UTC m=+45.037440887" observedRunningTime="2026-04-23 16:35:35.517763361 +0000 UTC m=+45.780549296" watchObservedRunningTime="2026-04-23 16:35:35.519359896 +0000 UTC m=+45.782145843" Apr 23 16:35:35.610862 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.610830 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e32dc24f-77a9-4c9d-a568-3e7866f08632-client-ca-bundle\") pod \"metrics-server-7c7c78bc5d-5knsj\" (UID: \"e32dc24f-77a9-4c9d-a568-3e7866f08632\") " pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj" Apr 23 16:35:35.611034 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.610895 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zp6n\" (UniqueName: \"kubernetes.io/projected/e32dc24f-77a9-4c9d-a568-3e7866f08632-kube-api-access-4zp6n\") pod \"metrics-server-7c7c78bc5d-5knsj\" (UID: \"e32dc24f-77a9-4c9d-a568-3e7866f08632\") " pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj" Apr 23 16:35:35.611034 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.610954 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e32dc24f-77a9-4c9d-a568-3e7866f08632-secret-metrics-server-client-certs\") pod \"metrics-server-7c7c78bc5d-5knsj\" (UID: 
\"e32dc24f-77a9-4c9d-a568-3e7866f08632\") " pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj" Apr 23 16:35:35.611034 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.610987 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e32dc24f-77a9-4c9d-a568-3e7866f08632-secret-metrics-server-tls\") pod \"metrics-server-7c7c78bc5d-5knsj\" (UID: \"e32dc24f-77a9-4c9d-a568-3e7866f08632\") " pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj" Apr 23 16:35:35.611034 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.611019 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e32dc24f-77a9-4c9d-a568-3e7866f08632-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7c7c78bc5d-5knsj\" (UID: \"e32dc24f-77a9-4c9d-a568-3e7866f08632\") " pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj" Apr 23 16:35:35.611254 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.611043 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e32dc24f-77a9-4c9d-a568-3e7866f08632-metrics-server-audit-profiles\") pod \"metrics-server-7c7c78bc5d-5knsj\" (UID: \"e32dc24f-77a9-4c9d-a568-3e7866f08632\") " pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj" Apr 23 16:35:35.611254 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.611104 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e32dc24f-77a9-4c9d-a568-3e7866f08632-audit-log\") pod \"metrics-server-7c7c78bc5d-5knsj\" (UID: \"e32dc24f-77a9-4c9d-a568-3e7866f08632\") " pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj" Apr 23 16:35:35.611730 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.611495 2580 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e32dc24f-77a9-4c9d-a568-3e7866f08632-audit-log\") pod \"metrics-server-7c7c78bc5d-5knsj\" (UID: \"e32dc24f-77a9-4c9d-a568-3e7866f08632\") " pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj" Apr 23 16:35:35.612461 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.612371 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e32dc24f-77a9-4c9d-a568-3e7866f08632-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7c7c78bc5d-5knsj\" (UID: \"e32dc24f-77a9-4c9d-a568-3e7866f08632\") " pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj" Apr 23 16:35:35.613380 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.612543 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e32dc24f-77a9-4c9d-a568-3e7866f08632-metrics-server-audit-profiles\") pod \"metrics-server-7c7c78bc5d-5knsj\" (UID: \"e32dc24f-77a9-4c9d-a568-3e7866f08632\") " pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj" Apr 23 16:35:35.614749 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.614723 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e32dc24f-77a9-4c9d-a568-3e7866f08632-secret-metrics-server-client-certs\") pod \"metrics-server-7c7c78bc5d-5knsj\" (UID: \"e32dc24f-77a9-4c9d-a568-3e7866f08632\") " pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj" Apr 23 16:35:35.614900 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.614880 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e32dc24f-77a9-4c9d-a568-3e7866f08632-client-ca-bundle\") pod \"metrics-server-7c7c78bc5d-5knsj\" (UID: 
\"e32dc24f-77a9-4c9d-a568-3e7866f08632\") " pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj" Apr 23 16:35:35.615752 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.615719 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e32dc24f-77a9-4c9d-a568-3e7866f08632-secret-metrics-server-tls\") pod \"metrics-server-7c7c78bc5d-5knsj\" (UID: \"e32dc24f-77a9-4c9d-a568-3e7866f08632\") " pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj" Apr 23 16:35:35.622693 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.622669 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zp6n\" (UniqueName: \"kubernetes.io/projected/e32dc24f-77a9-4c9d-a568-3e7866f08632-kube-api-access-4zp6n\") pod \"metrics-server-7c7c78bc5d-5knsj\" (UID: \"e32dc24f-77a9-4c9d-a568-3e7866f08632\") " pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj" Apr 23 16:35:35.793462 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.793023 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj" Apr 23 16:35:35.983875 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.983845 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7c7c78bc5d-5knsj"] Apr 23 16:35:35.983978 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:35.983896 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-tvs8x"] Apr 23 16:35:35.995942 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:35:35.995904 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode32dc24f_77a9_4c9d_a568_3e7866f08632.slice/crio-1a03b65806586fcbbb85f525deec5b59ffc027fb849823f613b189dbb9e2be5a WatchSource:0}: Error finding container 1a03b65806586fcbbb85f525deec5b59ffc027fb849823f613b189dbb9e2be5a: Status 404 returned error can't find the container with id 1a03b65806586fcbbb85f525deec5b59ffc027fb849823f613b189dbb9e2be5a Apr 23 16:35:36.023412 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.023373 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-tvs8x"] Apr 23 16:35:36.023567 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.023544 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tvs8x" Apr 23 16:35:36.028238 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.028208 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-x72mx\"" Apr 23 16:35:36.028376 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.028242 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 23 16:35:36.117032 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.116967 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/241fa968-de71-4d41-bb5d-d2886ebb5366-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-tvs8x\" (UID: \"241fa968-de71-4d41-bb5d-d2886ebb5366\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tvs8x" Apr 23 16:35:36.218599 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.218535 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/241fa968-de71-4d41-bb5d-d2886ebb5366-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-tvs8x\" (UID: \"241fa968-de71-4d41-bb5d-d2886ebb5366\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tvs8x" Apr 23 16:35:36.218765 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:36.218733 2580 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 23 16:35:36.218834 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:35:36.218800 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/241fa968-de71-4d41-bb5d-d2886ebb5366-monitoring-plugin-cert podName:241fa968-de71-4d41-bb5d-d2886ebb5366 nodeName:}" failed. 
No retries permitted until 2026-04-23 16:35:36.718781227 +0000 UTC m=+46.981567144 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/241fa968-de71-4d41-bb5d-d2886ebb5366-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-tvs8x" (UID: "241fa968-de71-4d41-bb5d-d2886ebb5366") : secret "monitoring-plugin-cert" not found Apr 23 16:35:36.413096 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.413012 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-547d785944-t6rr7"] Apr 23 16:35:36.435220 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.435179 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-547d785944-t6rr7" Apr 23 16:35:36.437798 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.437770 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-547d785944-t6rr7"] Apr 23 16:35:36.438201 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.438186 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 23 16:35:36.439101 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.439081 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 23 16:35:36.439189 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.439176 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 23 16:35:36.439378 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.439363 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-4t5f7\"" Apr 23 16:35:36.441093 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.441067 2580 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 23 16:35:36.441602 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.441558 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 23 16:35:36.450522 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.450499 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 23 16:35:36.488617 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.488364 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pvhzm" event={"ID":"f8cb88e9-6f22-4927-807b-b213102a45ed","Type":"ContainerStarted","Data":"5de0a22b95fed2210f1cc35e02b06ac29fded948c3d6a687ebef2db44b3ac9de"} Apr 23 16:35:36.488617 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.488422 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pvhzm" event={"ID":"f8cb88e9-6f22-4927-807b-b213102a45ed","Type":"ContainerStarted","Data":"949ed63b0ba60e48d710efd2344c0bb9e209434577cb290e504e1e37675d7f38"} Apr 23 16:35:36.490122 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.490092 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj" event={"ID":"e32dc24f-77a9-4c9d-a568-3e7866f08632","Type":"ContainerStarted","Data":"1a03b65806586fcbbb85f525deec5b59ffc027fb849823f613b189dbb9e2be5a"} Apr 23 16:35:36.517871 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.517821 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-pvhzm" podStartSLOduration=3.695615415 podStartE2EDuration="5.517807331s" podCreationTimestamp="2026-04-23 16:35:31 +0000 UTC" firstStartedPulling="2026-04-23 16:35:32.078409031 +0000 UTC m=+42.341194955" 
lastFinishedPulling="2026-04-23 16:35:33.900600958 +0000 UTC m=+44.163386871" observedRunningTime="2026-04-23 16:35:36.517616294 +0000 UTC m=+46.780402226" watchObservedRunningTime="2026-04-23 16:35:36.517807331 +0000 UTC m=+46.780593260" Apr 23 16:35:36.521322 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.521290 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12e1e988-537f-4918-b283-86b348f0c63f-metrics-client-ca\") pod \"telemeter-client-547d785944-t6rr7\" (UID: \"12e1e988-537f-4918-b283-86b348f0c63f\") " pod="openshift-monitoring/telemeter-client-547d785944-t6rr7" Apr 23 16:35:36.521468 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.521376 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/12e1e988-537f-4918-b283-86b348f0c63f-telemeter-client-tls\") pod \"telemeter-client-547d785944-t6rr7\" (UID: \"12e1e988-537f-4918-b283-86b348f0c63f\") " pod="openshift-monitoring/telemeter-client-547d785944-t6rr7" Apr 23 16:35:36.521529 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.521502 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12e1e988-537f-4918-b283-86b348f0c63f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-547d785944-t6rr7\" (UID: \"12e1e988-537f-4918-b283-86b348f0c63f\") " pod="openshift-monitoring/telemeter-client-547d785944-t6rr7" Apr 23 16:35:36.521608 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.521558 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gxxj\" (UniqueName: \"kubernetes.io/projected/12e1e988-537f-4918-b283-86b348f0c63f-kube-api-access-4gxxj\") pod \"telemeter-client-547d785944-t6rr7\" (UID: 
\"12e1e988-537f-4918-b283-86b348f0c63f\") " pod="openshift-monitoring/telemeter-client-547d785944-t6rr7" Apr 23 16:35:36.521661 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.521616 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/12e1e988-537f-4918-b283-86b348f0c63f-federate-client-tls\") pod \"telemeter-client-547d785944-t6rr7\" (UID: \"12e1e988-537f-4918-b283-86b348f0c63f\") " pod="openshift-monitoring/telemeter-client-547d785944-t6rr7" Apr 23 16:35:36.521709 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.521666 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/12e1e988-537f-4918-b283-86b348f0c63f-secret-telemeter-client\") pod \"telemeter-client-547d785944-t6rr7\" (UID: \"12e1e988-537f-4918-b283-86b348f0c63f\") " pod="openshift-monitoring/telemeter-client-547d785944-t6rr7" Apr 23 16:35:36.521709 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.521693 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/12e1e988-537f-4918-b283-86b348f0c63f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-547d785944-t6rr7\" (UID: \"12e1e988-537f-4918-b283-86b348f0c63f\") " pod="openshift-monitoring/telemeter-client-547d785944-t6rr7" Apr 23 16:35:36.521805 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.521725 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12e1e988-537f-4918-b283-86b348f0c63f-serving-certs-ca-bundle\") pod \"telemeter-client-547d785944-t6rr7\" (UID: \"12e1e988-537f-4918-b283-86b348f0c63f\") " pod="openshift-monitoring/telemeter-client-547d785944-t6rr7" Apr 23 
16:35:36.622640 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.622599 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12e1e988-537f-4918-b283-86b348f0c63f-metrics-client-ca\") pod \"telemeter-client-547d785944-t6rr7\" (UID: \"12e1e988-537f-4918-b283-86b348f0c63f\") " pod="openshift-monitoring/telemeter-client-547d785944-t6rr7" Apr 23 16:35:36.622813 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.622689 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/12e1e988-537f-4918-b283-86b348f0c63f-telemeter-client-tls\") pod \"telemeter-client-547d785944-t6rr7\" (UID: \"12e1e988-537f-4918-b283-86b348f0c63f\") " pod="openshift-monitoring/telemeter-client-547d785944-t6rr7" Apr 23 16:35:36.622813 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.622747 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12e1e988-537f-4918-b283-86b348f0c63f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-547d785944-t6rr7\" (UID: \"12e1e988-537f-4918-b283-86b348f0c63f\") " pod="openshift-monitoring/telemeter-client-547d785944-t6rr7" Apr 23 16:35:36.622813 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.622784 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gxxj\" (UniqueName: \"kubernetes.io/projected/12e1e988-537f-4918-b283-86b348f0c63f-kube-api-access-4gxxj\") pod \"telemeter-client-547d785944-t6rr7\" (UID: \"12e1e988-537f-4918-b283-86b348f0c63f\") " pod="openshift-monitoring/telemeter-client-547d785944-t6rr7" Apr 23 16:35:36.622813 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.622811 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: 
\"kubernetes.io/secret/12e1e988-537f-4918-b283-86b348f0c63f-federate-client-tls\") pod \"telemeter-client-547d785944-t6rr7\" (UID: \"12e1e988-537f-4918-b283-86b348f0c63f\") " pod="openshift-monitoring/telemeter-client-547d785944-t6rr7"
Apr 23 16:35:36.622969 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.622831 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/12e1e988-537f-4918-b283-86b348f0c63f-secret-telemeter-client\") pod \"telemeter-client-547d785944-t6rr7\" (UID: \"12e1e988-537f-4918-b283-86b348f0c63f\") " pod="openshift-monitoring/telemeter-client-547d785944-t6rr7"
Apr 23 16:35:36.622969 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.622855 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/12e1e988-537f-4918-b283-86b348f0c63f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-547d785944-t6rr7\" (UID: \"12e1e988-537f-4918-b283-86b348f0c63f\") " pod="openshift-monitoring/telemeter-client-547d785944-t6rr7"
Apr 23 16:35:36.622969 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.622884 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12e1e988-537f-4918-b283-86b348f0c63f-serving-certs-ca-bundle\") pod \"telemeter-client-547d785944-t6rr7\" (UID: \"12e1e988-537f-4918-b283-86b348f0c63f\") " pod="openshift-monitoring/telemeter-client-547d785944-t6rr7"
Apr 23 16:35:36.623675 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.623644 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12e1e988-537f-4918-b283-86b348f0c63f-serving-certs-ca-bundle\") pod \"telemeter-client-547d785944-t6rr7\" (UID: \"12e1e988-537f-4918-b283-86b348f0c63f\") " pod="openshift-monitoring/telemeter-client-547d785944-t6rr7"
Apr 23 16:35:36.624183 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.624165 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12e1e988-537f-4918-b283-86b348f0c63f-metrics-client-ca\") pod \"telemeter-client-547d785944-t6rr7\" (UID: \"12e1e988-537f-4918-b283-86b348f0c63f\") " pod="openshift-monitoring/telemeter-client-547d785944-t6rr7"
Apr 23 16:35:36.627885 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.627857 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/12e1e988-537f-4918-b283-86b348f0c63f-telemeter-client-tls\") pod \"telemeter-client-547d785944-t6rr7\" (UID: \"12e1e988-537f-4918-b283-86b348f0c63f\") " pod="openshift-monitoring/telemeter-client-547d785944-t6rr7"
Apr 23 16:35:36.629617 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.629593 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/12e1e988-537f-4918-b283-86b348f0c63f-secret-telemeter-client\") pod \"telemeter-client-547d785944-t6rr7\" (UID: \"12e1e988-537f-4918-b283-86b348f0c63f\") " pod="openshift-monitoring/telemeter-client-547d785944-t6rr7"
Apr 23 16:35:36.632285 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.632260 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/12e1e988-537f-4918-b283-86b348f0c63f-federate-client-tls\") pod \"telemeter-client-547d785944-t6rr7\" (UID: \"12e1e988-537f-4918-b283-86b348f0c63f\") " pod="openshift-monitoring/telemeter-client-547d785944-t6rr7"
Apr 23 16:35:36.637120 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.637043 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gxxj\" (UniqueName: \"kubernetes.io/projected/12e1e988-537f-4918-b283-86b348f0c63f-kube-api-access-4gxxj\") pod \"telemeter-client-547d785944-t6rr7\" (UID: \"12e1e988-537f-4918-b283-86b348f0c63f\") " pod="openshift-monitoring/telemeter-client-547d785944-t6rr7"
Apr 23 16:35:36.637426 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.637386 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12e1e988-537f-4918-b283-86b348f0c63f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-547d785944-t6rr7\" (UID: \"12e1e988-537f-4918-b283-86b348f0c63f\") " pod="openshift-monitoring/telemeter-client-547d785944-t6rr7"
Apr 23 16:35:36.638322 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.638298 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/12e1e988-537f-4918-b283-86b348f0c63f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-547d785944-t6rr7\" (UID: \"12e1e988-537f-4918-b283-86b348f0c63f\") " pod="openshift-monitoring/telemeter-client-547d785944-t6rr7"
Apr 23 16:35:36.723389 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.723349 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/241fa968-de71-4d41-bb5d-d2886ebb5366-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-tvs8x\" (UID: \"241fa968-de71-4d41-bb5d-d2886ebb5366\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tvs8x"
Apr 23 16:35:36.726513 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.726483 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/241fa968-de71-4d41-bb5d-d2886ebb5366-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-tvs8x\" (UID: \"241fa968-de71-4d41-bb5d-d2886ebb5366\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tvs8x"
Apr 23 16:35:36.752277 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.752245 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-547d785944-t6rr7"
Apr 23 16:35:36.915412 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.915371 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-547d785944-t6rr7"]
Apr 23 16:35:36.929674 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:35:36.929619 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12e1e988_537f_4918_b283_86b348f0c63f.slice/crio-75240b74320dc5ebdf2ff4a88614988afb2dddb3e1cd4d8f649d96494bd581b6 WatchSource:0}: Error finding container 75240b74320dc5ebdf2ff4a88614988afb2dddb3e1cd4d8f649d96494bd581b6: Status 404 returned error can't find the container with id 75240b74320dc5ebdf2ff4a88614988afb2dddb3e1cd4d8f649d96494bd581b6
Apr 23 16:35:36.934724 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:36.934699 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tvs8x"
Apr 23 16:35:37.096981 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:35:37.096935 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod241fa968_de71_4d41_bb5d_d2886ebb5366.slice/crio-ec2fbc33ae46a912f78e10e710cba9d4d389fdd2daa90d695fef12213bff4d6c WatchSource:0}: Error finding container ec2fbc33ae46a912f78e10e710cba9d4d389fdd2daa90d695fef12213bff4d6c: Status 404 returned error can't find the container with id ec2fbc33ae46a912f78e10e710cba9d4d389fdd2daa90d695fef12213bff4d6c
Apr 23 16:35:37.099721 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:37.099693 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-tvs8x"]
Apr 23 16:35:37.494891 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:37.494842 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-547d785944-t6rr7" event={"ID":"12e1e988-537f-4918-b283-86b348f0c63f","Type":"ContainerStarted","Data":"75240b74320dc5ebdf2ff4a88614988afb2dddb3e1cd4d8f649d96494bd581b6"}
Apr 23 16:35:37.496078 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:37.496052 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tvs8x" event={"ID":"241fa968-de71-4d41-bb5d-d2886ebb5366","Type":"ContainerStarted","Data":"ec2fbc33ae46a912f78e10e710cba9d4d389fdd2daa90d695fef12213bff4d6c"}
Apr 23 16:35:39.504884 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:39.504837 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" event={"ID":"7e9f2050-7154-45e4-8941-45715943a2f9","Type":"ContainerStarted","Data":"c9bce6ae436ebee33a3cfc69a0ad3aea2d628563b7189f5500c58f9dab566bff"}
Apr 23 16:35:39.505328 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:39.504892 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" event={"ID":"7e9f2050-7154-45e4-8941-45715943a2f9","Type":"ContainerStarted","Data":"9b1f938f26b4429ac1bb86d209b735bc23a9342c0005ba9a6bf15660fe74bf9a"}
Apr 23 16:35:39.505328 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:39.504908 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" event={"ID":"7e9f2050-7154-45e4-8941-45715943a2f9","Type":"ContainerStarted","Data":"5e7ae9f4c359bc43a43d3ab68d0b335648ee01ef5956af0d38a68d71be295b39"}
Apr 23 16:35:39.506766 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:39.506725 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj" event={"ID":"e32dc24f-77a9-4c9d-a568-3e7866f08632","Type":"ContainerStarted","Data":"a7d10758602ecec7a8be9ebe029cc675d3a66a51887853f1ff5f81b18488ba68"}
Apr 23 16:35:39.530396 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:39.530326 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj" podStartSLOduration=1.670392288 podStartE2EDuration="4.530305935s" podCreationTimestamp="2026-04-23 16:35:35 +0000 UTC" firstStartedPulling="2026-04-23 16:35:35.999130498 +0000 UTC m=+46.261916409" lastFinishedPulling="2026-04-23 16:35:38.859044131 +0000 UTC m=+49.121830056" observedRunningTime="2026-04-23 16:35:39.528402713 +0000 UTC m=+49.791188648" watchObservedRunningTime="2026-04-23 16:35:39.530305935 +0000 UTC m=+49.793091870"
Apr 23 16:35:41.514217 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:41.514175 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tvs8x" event={"ID":"241fa968-de71-4d41-bb5d-d2886ebb5366","Type":"ContainerStarted","Data":"1e33761cc1f03c3f0747dc429382c6f4724d6165ab6e2035db5ae61d4af726e4"}
Apr 23 16:35:41.514693 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:41.514640 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tvs8x"
Apr 23 16:35:41.519544 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:41.518055 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" event={"ID":"7e9f2050-7154-45e4-8941-45715943a2f9","Type":"ContainerStarted","Data":"240823bfe906a45e931815e8f1cac4e4a2a83ed11f5390320ab997dac91839c9"}
Apr 23 16:35:41.519544 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:41.518096 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" event={"ID":"7e9f2050-7154-45e4-8941-45715943a2f9","Type":"ContainerStarted","Data":"d9e5ce8d533fe29b1df2f3afb503d68969097cc280aadbf73d0ecc26882fee1d"}
Apr 23 16:35:41.519772 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:41.519648 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-547d785944-t6rr7" event={"ID":"12e1e988-537f-4918-b283-86b348f0c63f","Type":"ContainerStarted","Data":"0ea58dfb5aec2c9ffa7099a4dfd682d81d3ecb2f24186d77d69565bd2a6ec8d8"}
Apr 23 16:35:41.521133 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:41.521111 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tvs8x"
Apr 23 16:35:41.536184 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:41.536123 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tvs8x" podStartSLOduration=2.627940852 podStartE2EDuration="6.536103635s" podCreationTimestamp="2026-04-23 16:35:35 +0000 UTC" firstStartedPulling="2026-04-23 16:35:37.100543941 +0000 UTC m=+47.363329853" lastFinishedPulling="2026-04-23 16:35:41.008706723 +0000 UTC m=+51.271492636" observedRunningTime="2026-04-23 16:35:41.534562988 +0000 UTC m=+51.797348935" watchObservedRunningTime="2026-04-23 16:35:41.536103635 +0000 UTC m=+51.798889569"
Apr 23 16:35:42.469603 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:42.469551 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vsd8g"
Apr 23 16:35:42.528508 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:42.528474 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" event={"ID":"7e9f2050-7154-45e4-8941-45715943a2f9","Type":"ContainerStarted","Data":"69aa0dbbf03c3db18a7f28eb421e1758d8f8246c68869728d0cbb35c7ad1394f"}
Apr 23 16:35:42.528936 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:42.528539 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd"
Apr 23 16:35:42.563962 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:42.563899 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd" podStartSLOduration=2.215915608 podStartE2EDuration="8.563876521s" podCreationTimestamp="2026-04-23 16:35:34 +0000 UTC" firstStartedPulling="2026-04-23 16:35:34.937487232 +0000 UTC m=+45.200273143" lastFinishedPulling="2026-04-23 16:35:41.285448141 +0000 UTC m=+51.548234056" observedRunningTime="2026-04-23 16:35:42.560929572 +0000 UTC m=+52.823715507" watchObservedRunningTime="2026-04-23 16:35:42.563876521 +0000 UTC m=+52.826662455"
Apr 23 16:35:43.532819 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:43.532783 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-547d785944-t6rr7" event={"ID":"12e1e988-537f-4918-b283-86b348f0c63f","Type":"ContainerStarted","Data":"b9bfaacd81581329304dfc8b422ee64d995f3f27b3e39f6b4d559e1bd487b06a"}
Apr 23 16:35:43.533299 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:43.532828 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-547d785944-t6rr7" event={"ID":"12e1e988-537f-4918-b283-86b348f0c63f","Type":"ContainerStarted","Data":"bde9f6de68c2caefe042ab11aa1ae36ac49a55e825f608603d291d880c5b8081"}
Apr 23 16:35:43.564229 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:43.563967 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-547d785944-t6rr7" podStartSLOduration=1.745153229 podStartE2EDuration="7.563946137s" podCreationTimestamp="2026-04-23 16:35:36 +0000 UTC" firstStartedPulling="2026-04-23 16:35:36.931998393 +0000 UTC m=+47.194784305" lastFinishedPulling="2026-04-23 16:35:42.750791295 +0000 UTC m=+53.013577213" observedRunningTime="2026-04-23 16:35:43.561875732 +0000 UTC m=+53.824661664" watchObservedRunningTime="2026-04-23 16:35:43.563946137 +0000 UTC m=+53.826732071"
Apr 23 16:35:44.540563 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:44.540537 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-d749c5fc6-5sfqd"
Apr 23 16:35:45.597908 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:45.597868 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f6f574d7b-4xfcq"]
Apr 23 16:35:45.602769 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:45.602745 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f6f574d7b-4xfcq"
Apr 23 16:35:45.605517 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:45.605489 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 23 16:35:45.606610 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:45.606569 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 23 16:35:45.606845 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:45.606819 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 23 16:35:45.606954 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:45.606569 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 23 16:35:45.607019 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:45.606628 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 23 16:35:45.607019 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:45.606632 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-gjn46\""
Apr 23 16:35:45.615774 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:45.615748 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f6f574d7b-4xfcq"]
Apr 23 16:35:45.714875 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:45.714838 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/998fc808-8285-4e28-9013-f783163c059e-console-serving-cert\") pod \"console-5f6f574d7b-4xfcq\" (UID: \"998fc808-8285-4e28-9013-f783163c059e\") " pod="openshift-console/console-5f6f574d7b-4xfcq"
Apr 23 16:35:45.715068 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:45.714992 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgwmr\" (UniqueName: \"kubernetes.io/projected/998fc808-8285-4e28-9013-f783163c059e-kube-api-access-bgwmr\") pod \"console-5f6f574d7b-4xfcq\" (UID: \"998fc808-8285-4e28-9013-f783163c059e\") " pod="openshift-console/console-5f6f574d7b-4xfcq"
Apr 23 16:35:45.715068 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:45.715038 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/998fc808-8285-4e28-9013-f783163c059e-oauth-serving-cert\") pod \"console-5f6f574d7b-4xfcq\" (UID: \"998fc808-8285-4e28-9013-f783163c059e\") " pod="openshift-console/console-5f6f574d7b-4xfcq"
Apr 23 16:35:45.715165 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:45.715116 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/998fc808-8285-4e28-9013-f783163c059e-console-oauth-config\") pod \"console-5f6f574d7b-4xfcq\" (UID: \"998fc808-8285-4e28-9013-f783163c059e\") " pod="openshift-console/console-5f6f574d7b-4xfcq"
Apr 23 16:35:45.715165 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:45.715146 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/998fc808-8285-4e28-9013-f783163c059e-console-config\") pod \"console-5f6f574d7b-4xfcq\" (UID: \"998fc808-8285-4e28-9013-f783163c059e\") " pod="openshift-console/console-5f6f574d7b-4xfcq"
Apr 23 16:35:45.715240 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:45.715198 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/998fc808-8285-4e28-9013-f783163c059e-service-ca\") pod \"console-5f6f574d7b-4xfcq\" (UID: \"998fc808-8285-4e28-9013-f783163c059e\") " pod="openshift-console/console-5f6f574d7b-4xfcq"
Apr 23 16:35:45.816395 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:45.816355 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/998fc808-8285-4e28-9013-f783163c059e-console-oauth-config\") pod \"console-5f6f574d7b-4xfcq\" (UID: \"998fc808-8285-4e28-9013-f783163c059e\") " pod="openshift-console/console-5f6f574d7b-4xfcq"
Apr 23 16:35:45.816627 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:45.816405 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/998fc808-8285-4e28-9013-f783163c059e-console-config\") pod \"console-5f6f574d7b-4xfcq\" (UID: \"998fc808-8285-4e28-9013-f783163c059e\") " pod="openshift-console/console-5f6f574d7b-4xfcq"
Apr 23 16:35:45.816627 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:45.816458 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/998fc808-8285-4e28-9013-f783163c059e-service-ca\") pod \"console-5f6f574d7b-4xfcq\" (UID: \"998fc808-8285-4e28-9013-f783163c059e\") " pod="openshift-console/console-5f6f574d7b-4xfcq"
Apr 23 16:35:45.816627 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:45.816488 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/998fc808-8285-4e28-9013-f783163c059e-console-serving-cert\") pod \"console-5f6f574d7b-4xfcq\" (UID: \"998fc808-8285-4e28-9013-f783163c059e\") " pod="openshift-console/console-5f6f574d7b-4xfcq"
Apr 23 16:35:45.816627 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:45.816623 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bgwmr\" (UniqueName: \"kubernetes.io/projected/998fc808-8285-4e28-9013-f783163c059e-kube-api-access-bgwmr\") pod \"console-5f6f574d7b-4xfcq\" (UID: \"998fc808-8285-4e28-9013-f783163c059e\") " pod="openshift-console/console-5f6f574d7b-4xfcq"
Apr 23 16:35:45.816838 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:45.816653 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/998fc808-8285-4e28-9013-f783163c059e-oauth-serving-cert\") pod \"console-5f6f574d7b-4xfcq\" (UID: \"998fc808-8285-4e28-9013-f783163c059e\") " pod="openshift-console/console-5f6f574d7b-4xfcq"
Apr 23 16:35:45.817356 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:45.817320 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/998fc808-8285-4e28-9013-f783163c059e-oauth-serving-cert\") pod \"console-5f6f574d7b-4xfcq\" (UID: \"998fc808-8285-4e28-9013-f783163c059e\") " pod="openshift-console/console-5f6f574d7b-4xfcq"
Apr 23 16:35:45.817480 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:45.817457 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/998fc808-8285-4e28-9013-f783163c059e-console-config\") pod \"console-5f6f574d7b-4xfcq\" (UID: \"998fc808-8285-4e28-9013-f783163c059e\") " pod="openshift-console/console-5f6f574d7b-4xfcq"
Apr 23 16:35:45.818052 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:45.818027 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/998fc808-8285-4e28-9013-f783163c059e-service-ca\") pod \"console-5f6f574d7b-4xfcq\" (UID: \"998fc808-8285-4e28-9013-f783163c059e\") " pod="openshift-console/console-5f6f574d7b-4xfcq"
Apr 23 16:35:45.820043 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:45.820016 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/998fc808-8285-4e28-9013-f783163c059e-console-oauth-config\") pod \"console-5f6f574d7b-4xfcq\" (UID: \"998fc808-8285-4e28-9013-f783163c059e\") " pod="openshift-console/console-5f6f574d7b-4xfcq"
Apr 23 16:35:45.820406 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:45.820368 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/998fc808-8285-4e28-9013-f783163c059e-console-serving-cert\") pod \"console-5f6f574d7b-4xfcq\" (UID: \"998fc808-8285-4e28-9013-f783163c059e\") " pod="openshift-console/console-5f6f574d7b-4xfcq"
Apr 23 16:35:45.830281 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:45.830256 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgwmr\" (UniqueName: \"kubernetes.io/projected/998fc808-8285-4e28-9013-f783163c059e-kube-api-access-bgwmr\") pod \"console-5f6f574d7b-4xfcq\" (UID: \"998fc808-8285-4e28-9013-f783163c059e\") " pod="openshift-console/console-5f6f574d7b-4xfcq"
Apr 23 16:35:45.913606 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:45.913490 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f6f574d7b-4xfcq"
Apr 23 16:35:49.426305 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:49.426272 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xvn7t"
Apr 23 16:35:51.846858 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:51.846803 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f6f574d7b-4xfcq"]
Apr 23 16:35:51.848841 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:35:51.848805 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod998fc808_8285_4e28_9013_f783163c059e.slice/crio-7208ca0816713f62376295fb0e51f22af81d1a5ef9b2b5e90e1856e11ac88f59 WatchSource:0}: Error finding container 7208ca0816713f62376295fb0e51f22af81d1a5ef9b2b5e90e1856e11ac88f59: Status 404 returned error can't find the container with id 7208ca0816713f62376295fb0e51f22af81d1a5ef9b2b5e90e1856e11ac88f59
Apr 23 16:35:52.567440 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:52.567401 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-7zcwv" event={"ID":"e2859776-cf83-49ed-ac30-69c2ec6863e5","Type":"ContainerStarted","Data":"4239daf1fbb670d57375979527098041e5c7be3c7d62f70581351d4778189edf"}
Apr 23 16:35:52.567853 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:52.567821 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-7zcwv"
Apr 23 16:35:52.568647 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:52.568613 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f6f574d7b-4xfcq" event={"ID":"998fc808-8285-4e28-9013-f783163c059e","Type":"ContainerStarted","Data":"7208ca0816713f62376295fb0e51f22af81d1a5ef9b2b5e90e1856e11ac88f59"}
Apr 23 16:35:52.581388 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:52.581363 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-7zcwv"
Apr 23 16:35:52.587640 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:52.586937 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-7zcwv" podStartSLOduration=1.402727466 podStartE2EDuration="18.58692419s" podCreationTimestamp="2026-04-23 16:35:34 +0000 UTC" firstStartedPulling="2026-04-23 16:35:34.949589315 +0000 UTC m=+45.212375240" lastFinishedPulling="2026-04-23 16:35:52.133786051 +0000 UTC m=+62.396571964" observedRunningTime="2026-04-23 16:35:52.58554591 +0000 UTC m=+62.848331843" watchObservedRunningTime="2026-04-23 16:35:52.58692419 +0000 UTC m=+62.849710182"
Apr 23 16:35:54.548423 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:54.547118 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6b8968874f-jbt4z"]
Apr 23 16:35:54.586133 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:54.585956 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b8968874f-jbt4z"]
Apr 23 16:35:54.586616 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:54.586340 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b8968874f-jbt4z"
Apr 23 16:35:54.595836 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:54.595319 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 23 16:35:54.704073 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:54.704012 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx24b\" (UniqueName: \"kubernetes.io/projected/f13e47b0-66c9-4673-8591-38a90b4e25bd-kube-api-access-hx24b\") pod \"console-6b8968874f-jbt4z\" (UID: \"f13e47b0-66c9-4673-8591-38a90b4e25bd\") " pod="openshift-console/console-6b8968874f-jbt4z"
Apr 23 16:35:54.704073 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:54.704081 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f13e47b0-66c9-4673-8591-38a90b4e25bd-console-oauth-config\") pod \"console-6b8968874f-jbt4z\" (UID: \"f13e47b0-66c9-4673-8591-38a90b4e25bd\") " pod="openshift-console/console-6b8968874f-jbt4z"
Apr 23 16:35:54.704334 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:54.704106 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f13e47b0-66c9-4673-8591-38a90b4e25bd-service-ca\") pod \"console-6b8968874f-jbt4z\" (UID: \"f13e47b0-66c9-4673-8591-38a90b4e25bd\") " pod="openshift-console/console-6b8968874f-jbt4z"
Apr 23 16:35:54.704334 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:54.704138 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f13e47b0-66c9-4673-8591-38a90b4e25bd-oauth-serving-cert\") pod \"console-6b8968874f-jbt4z\" (UID: \"f13e47b0-66c9-4673-8591-38a90b4e25bd\") " pod="openshift-console/console-6b8968874f-jbt4z"
Apr 23 16:35:54.704334 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:54.704162 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f13e47b0-66c9-4673-8591-38a90b4e25bd-console-config\") pod \"console-6b8968874f-jbt4z\" (UID: \"f13e47b0-66c9-4673-8591-38a90b4e25bd\") " pod="openshift-console/console-6b8968874f-jbt4z"
Apr 23 16:35:54.704334 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:54.704188 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f13e47b0-66c9-4673-8591-38a90b4e25bd-console-serving-cert\") pod \"console-6b8968874f-jbt4z\" (UID: \"f13e47b0-66c9-4673-8591-38a90b4e25bd\") " pod="openshift-console/console-6b8968874f-jbt4z"
Apr 23 16:35:54.704334 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:54.704232 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f13e47b0-66c9-4673-8591-38a90b4e25bd-trusted-ca-bundle\") pod \"console-6b8968874f-jbt4z\" (UID: \"f13e47b0-66c9-4673-8591-38a90b4e25bd\") " pod="openshift-console/console-6b8968874f-jbt4z"
Apr 23 16:35:54.806442 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:54.805411 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hx24b\" (UniqueName: \"kubernetes.io/projected/f13e47b0-66c9-4673-8591-38a90b4e25bd-kube-api-access-hx24b\") pod \"console-6b8968874f-jbt4z\" (UID: \"f13e47b0-66c9-4673-8591-38a90b4e25bd\") " pod="openshift-console/console-6b8968874f-jbt4z"
Apr 23 16:35:54.806442 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:54.805470 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f13e47b0-66c9-4673-8591-38a90b4e25bd-console-oauth-config\") pod \"console-6b8968874f-jbt4z\" (UID: \"f13e47b0-66c9-4673-8591-38a90b4e25bd\") " pod="openshift-console/console-6b8968874f-jbt4z"
Apr 23 16:35:54.806442 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:54.805492 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f13e47b0-66c9-4673-8591-38a90b4e25bd-service-ca\") pod \"console-6b8968874f-jbt4z\" (UID: \"f13e47b0-66c9-4673-8591-38a90b4e25bd\") " pod="openshift-console/console-6b8968874f-jbt4z"
Apr 23 16:35:54.806442 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:54.805670 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f13e47b0-66c9-4673-8591-38a90b4e25bd-oauth-serving-cert\") pod \"console-6b8968874f-jbt4z\" (UID: \"f13e47b0-66c9-4673-8591-38a90b4e25bd\") " pod="openshift-console/console-6b8968874f-jbt4z"
Apr 23 16:35:54.806442 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:54.805720 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f13e47b0-66c9-4673-8591-38a90b4e25bd-console-config\") pod \"console-6b8968874f-jbt4z\" (UID: \"f13e47b0-66c9-4673-8591-38a90b4e25bd\") " pod="openshift-console/console-6b8968874f-jbt4z"
Apr 23 16:35:54.806442 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:54.805754 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f13e47b0-66c9-4673-8591-38a90b4e25bd-console-serving-cert\") pod \"console-6b8968874f-jbt4z\" (UID: \"f13e47b0-66c9-4673-8591-38a90b4e25bd\") " pod="openshift-console/console-6b8968874f-jbt4z"
Apr 23 16:35:54.806442 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:54.805796 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f13e47b0-66c9-4673-8591-38a90b4e25bd-trusted-ca-bundle\") pod \"console-6b8968874f-jbt4z\" (UID: \"f13e47b0-66c9-4673-8591-38a90b4e25bd\") " pod="openshift-console/console-6b8968874f-jbt4z"
Apr 23 16:35:54.806442 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:54.806258 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f13e47b0-66c9-4673-8591-38a90b4e25bd-service-ca\") pod \"console-6b8968874f-jbt4z\" (UID: \"f13e47b0-66c9-4673-8591-38a90b4e25bd\") " pod="openshift-console/console-6b8968874f-jbt4z"
Apr 23 16:35:54.807174 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:54.806547 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f13e47b0-66c9-4673-8591-38a90b4e25bd-console-config\") pod \"console-6b8968874f-jbt4z\" (UID: \"f13e47b0-66c9-4673-8591-38a90b4e25bd\") " pod="openshift-console/console-6b8968874f-jbt4z"
Apr 23 16:35:54.807174 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:54.806686 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f13e47b0-66c9-4673-8591-38a90b4e25bd-trusted-ca-bundle\") pod \"console-6b8968874f-jbt4z\" (UID: \"f13e47b0-66c9-4673-8591-38a90b4e25bd\") " pod="openshift-console/console-6b8968874f-jbt4z"
Apr 23 16:35:54.809114 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:54.809032 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f13e47b0-66c9-4673-8591-38a90b4e25bd-console-serving-cert\") pod \"console-6b8968874f-jbt4z\" (UID: \"f13e47b0-66c9-4673-8591-38a90b4e25bd\") " pod="openshift-console/console-6b8968874f-jbt4z"
Apr 23 16:35:54.812241 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:54.812182 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f13e47b0-66c9-4673-8591-38a90b4e25bd-oauth-serving-cert\") pod \"console-6b8968874f-jbt4z\" (UID: \"f13e47b0-66c9-4673-8591-38a90b4e25bd\") " pod="openshift-console/console-6b8968874f-jbt4z"
Apr 23 16:35:54.813364 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:54.813314 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f13e47b0-66c9-4673-8591-38a90b4e25bd-console-oauth-config\") pod \"console-6b8968874f-jbt4z\" (UID: \"f13e47b0-66c9-4673-8591-38a90b4e25bd\") " pod="openshift-console/console-6b8968874f-jbt4z"
Apr 23 16:35:54.817409 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:54.817371 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx24b\" (UniqueName: \"kubernetes.io/projected/f13e47b0-66c9-4673-8591-38a90b4e25bd-kube-api-access-hx24b\") pod \"console-6b8968874f-jbt4z\" (UID: \"f13e47b0-66c9-4673-8591-38a90b4e25bd\") " pod="openshift-console/console-6b8968874f-jbt4z"
Apr 23 16:35:54.900446 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:54.900402 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b8968874f-jbt4z"
Apr 23 16:35:55.405231 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:55.405201 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b8968874f-jbt4z"]
Apr 23 16:35:55.408820 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:35:55.408767 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf13e47b0_66c9_4673_8591_38a90b4e25bd.slice/crio-eaed68fe0a6542e5da180a7cc5891e021eae55818b4e0164fbbf13b18234a0e0 WatchSource:0}: Error finding container eaed68fe0a6542e5da180a7cc5891e021eae55818b4e0164fbbf13b18234a0e0: Status 404 returned error can't find the container with id eaed68fe0a6542e5da180a7cc5891e021eae55818b4e0164fbbf13b18234a0e0
Apr 23 16:35:55.580948 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:55.580863 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b8968874f-jbt4z" event={"ID":"f13e47b0-66c9-4673-8591-38a90b4e25bd","Type":"ContainerStarted","Data":"eaed68fe0a6542e5da180a7cc5891e021eae55818b4e0164fbbf13b18234a0e0"}
Apr 23 16:35:55.794140 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:55.794099 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj"
Apr 23 16:35:55.794343 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:55.794162 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj"
Apr 23 16:35:55.917564 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:55.917242 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67b8cec4-f05e-4ef7-9456-915dfa5c7554-metrics-certs\") pod \"network-metrics-daemon-f889w\" (UID: \"67b8cec4-f05e-4ef7-9456-915dfa5c7554\") " pod="openshift-multus/network-metrics-daemon-f889w"
Apr 23
16:35:55.920549 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:55.920518 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 16:35:55.931027 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:55.930991 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67b8cec4-f05e-4ef7-9456-915dfa5c7554-metrics-certs\") pod \"network-metrics-daemon-f889w\" (UID: \"67b8cec4-f05e-4ef7-9456-915dfa5c7554\") " pod="openshift-multus/network-metrics-daemon-f889w" Apr 23 16:35:56.109337 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:56.109302 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vvm92\"" Apr 23 16:35:56.116895 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:56.116794 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f889w" Apr 23 16:35:56.118985 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:56.118946 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fcvz\" (UniqueName: \"kubernetes.io/projected/38d83fc0-30d4-48d7-8aee-f7afaa404c2e-kube-api-access-2fcvz\") pod \"network-check-target-7bn2z\" (UID: \"38d83fc0-30d4-48d7-8aee-f7afaa404c2e\") " pod="openshift-network-diagnostics/network-check-target-7bn2z" Apr 23 16:35:56.122173 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:56.122149 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 16:35:56.132746 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:56.132715 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 16:35:56.143182 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:56.143155 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fcvz\" (UniqueName: \"kubernetes.io/projected/38d83fc0-30d4-48d7-8aee-f7afaa404c2e-kube-api-access-2fcvz\") pod \"network-check-target-7bn2z\" (UID: \"38d83fc0-30d4-48d7-8aee-f7afaa404c2e\") " pod="openshift-network-diagnostics/network-check-target-7bn2z" Apr 23 16:35:56.259941 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:56.259905 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-f889w"] Apr 23 16:35:56.263421 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:35:56.263392 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67b8cec4_f05e_4ef7_9456_915dfa5c7554.slice/crio-dd5c2e5021ea5cd31113b075a1d96044bea6949019beb48cfcfb6aa07905ecd6 WatchSource:0}: Error finding container dd5c2e5021ea5cd31113b075a1d96044bea6949019beb48cfcfb6aa07905ecd6: Status 404 returned error can't find the container with id dd5c2e5021ea5cd31113b075a1d96044bea6949019beb48cfcfb6aa07905ecd6 Apr 23 16:35:56.418466 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:56.418375 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4csk9\"" Apr 23 16:35:56.425978 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:56.425948 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7bn2z" Apr 23 16:35:56.566108 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:56.566077 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7bn2z"] Apr 23 16:35:56.568756 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:35:56.568716 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38d83fc0_30d4_48d7_8aee_f7afaa404c2e.slice/crio-ce17b8e3a879492772ddfc6849fef67633a02bfcfaa4d860517087079f31e5bf WatchSource:0}: Error finding container ce17b8e3a879492772ddfc6849fef67633a02bfcfaa4d860517087079f31e5bf: Status 404 returned error can't find the container with id ce17b8e3a879492772ddfc6849fef67633a02bfcfaa4d860517087079f31e5bf Apr 23 16:35:56.585659 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:56.585623 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b8968874f-jbt4z" event={"ID":"f13e47b0-66c9-4673-8591-38a90b4e25bd","Type":"ContainerStarted","Data":"599b215ffb4f14a593a04ad26c56b323e1a4b059eefb7e67806c4f0f8325027a"} Apr 23 16:35:56.587494 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:56.587466 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f6f574d7b-4xfcq" event={"ID":"998fc808-8285-4e28-9013-f783163c059e","Type":"ContainerStarted","Data":"421fceebbc03e594ba8f6c4aa7ce4a85450e1341fd64a828d0138efd7bc87dbc"} Apr 23 16:35:56.589051 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:56.589026 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7bn2z" event={"ID":"38d83fc0-30d4-48d7-8aee-f7afaa404c2e","Type":"ContainerStarted","Data":"ce17b8e3a879492772ddfc6849fef67633a02bfcfaa4d860517087079f31e5bf"} Apr 23 16:35:56.590636 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:56.590610 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-f889w" event={"ID":"67b8cec4-f05e-4ef7-9456-915dfa5c7554","Type":"ContainerStarted","Data":"dd5c2e5021ea5cd31113b075a1d96044bea6949019beb48cfcfb6aa07905ecd6"} Apr 23 16:35:56.606325 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:56.606273 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6b8968874f-jbt4z" podStartSLOduration=2.154395523 podStartE2EDuration="2.606256584s" podCreationTimestamp="2026-04-23 16:35:54 +0000 UTC" firstStartedPulling="2026-04-23 16:35:55.411274254 +0000 UTC m=+65.674060179" lastFinishedPulling="2026-04-23 16:35:55.863135321 +0000 UTC m=+66.125921240" observedRunningTime="2026-04-23 16:35:56.604550944 +0000 UTC m=+66.867336904" watchObservedRunningTime="2026-04-23 16:35:56.606256584 +0000 UTC m=+66.869042522" Apr 23 16:35:56.623972 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:56.623926 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f6f574d7b-4xfcq" podStartSLOduration=7.823901647 podStartE2EDuration="11.623913783s" podCreationTimestamp="2026-04-23 16:35:45 +0000 UTC" firstStartedPulling="2026-04-23 16:35:51.850776398 +0000 UTC m=+62.113562323" lastFinishedPulling="2026-04-23 16:35:55.650788534 +0000 UTC m=+65.913574459" observedRunningTime="2026-04-23 16:35:56.623039985 +0000 UTC m=+66.885825918" watchObservedRunningTime="2026-04-23 16:35:56.623913783 +0000 UTC m=+66.886699717" Apr 23 16:35:58.601341 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:58.601257 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f889w" event={"ID":"67b8cec4-f05e-4ef7-9456-915dfa5c7554","Type":"ContainerStarted","Data":"20f341ea050fe844befd978dacd625411cd138ad43cbde75930990f89ce30442"} Apr 23 16:35:58.601341 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:58.601303 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-f889w" event={"ID":"67b8cec4-f05e-4ef7-9456-915dfa5c7554","Type":"ContainerStarted","Data":"d6be574cdcb98768e55957f0b8756b65559a29544869061519a515dd90522eb0"} Apr 23 16:35:58.620019 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:58.619951 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-f889w" podStartSLOduration=66.92897549 podStartE2EDuration="1m8.619929516s" podCreationTimestamp="2026-04-23 16:34:50 +0000 UTC" firstStartedPulling="2026-04-23 16:35:56.26581602 +0000 UTC m=+66.528601932" lastFinishedPulling="2026-04-23 16:35:57.956770044 +0000 UTC m=+68.219555958" observedRunningTime="2026-04-23 16:35:58.618752664 +0000 UTC m=+68.881538597" watchObservedRunningTime="2026-04-23 16:35:58.619929516 +0000 UTC m=+68.882715452" Apr 23 16:35:58.950971 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:58.950874 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4b505881-5503-4e1f-b72b-0d8abde1a5e0-original-pull-secret\") pod \"global-pull-secret-syncer-7lg5k\" (UID: \"4b505881-5503-4e1f-b72b-0d8abde1a5e0\") " pod="kube-system/global-pull-secret-syncer-7lg5k" Apr 23 16:35:58.953710 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:58.953672 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 16:35:58.964944 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:58.964879 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4b505881-5503-4e1f-b72b-0d8abde1a5e0-original-pull-secret\") pod \"global-pull-secret-syncer-7lg5k\" (UID: \"4b505881-5503-4e1f-b72b-0d8abde1a5e0\") " pod="kube-system/global-pull-secret-syncer-7lg5k" Apr 23 16:35:59.097374 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:35:59.097336 2580 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7lg5k" Apr 23 16:36:00.087557 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:00.087511 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7lg5k"] Apr 23 16:36:00.091851 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:36:00.091814 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b505881_5503_4e1f_b72b_0d8abde1a5e0.slice/crio-ad18035278bb5bec5d0ec12615c3229aa02c6d36851ae31f0d006c71bf0f0708 WatchSource:0}: Error finding container ad18035278bb5bec5d0ec12615c3229aa02c6d36851ae31f0d006c71bf0f0708: Status 404 returned error can't find the container with id ad18035278bb5bec5d0ec12615c3229aa02c6d36851ae31f0d006c71bf0f0708 Apr 23 16:36:00.618911 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:00.618869 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7bn2z" event={"ID":"38d83fc0-30d4-48d7-8aee-f7afaa404c2e","Type":"ContainerStarted","Data":"dfa009ed4d0ff645e8f8fcdf103f3130f00ed3f07026b31160fec9ef88205764"} Apr 23 16:36:00.619093 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:00.618990 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-7bn2z" Apr 23 16:36:00.620481 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:00.620447 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7lg5k" event={"ID":"4b505881-5503-4e1f-b72b-0d8abde1a5e0","Type":"ContainerStarted","Data":"ad18035278bb5bec5d0ec12615c3229aa02c6d36851ae31f0d006c71bf0f0708"} Apr 23 16:36:00.638571 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:00.638512 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-7bn2z" podStartSLOduration=66.788265443 
podStartE2EDuration="1m10.638488939s" podCreationTimestamp="2026-04-23 16:34:50 +0000 UTC" firstStartedPulling="2026-04-23 16:35:56.571207498 +0000 UTC m=+66.833993408" lastFinishedPulling="2026-04-23 16:36:00.421430989 +0000 UTC m=+70.684216904" observedRunningTime="2026-04-23 16:36:00.637465343 +0000 UTC m=+70.900251279" watchObservedRunningTime="2026-04-23 16:36:00.638488939 +0000 UTC m=+70.901274873" Apr 23 16:36:04.900784 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:04.900743 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6b8968874f-jbt4z" Apr 23 16:36:04.900784 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:04.900796 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6b8968874f-jbt4z" Apr 23 16:36:04.905771 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:04.905743 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6b8968874f-jbt4z" Apr 23 16:36:05.641724 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:05.641681 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7lg5k" event={"ID":"4b505881-5503-4e1f-b72b-0d8abde1a5e0","Type":"ContainerStarted","Data":"a1a43c73243fc7d2c5fc7f9075f7660dea42db1fe86025badb5b09bdd2a861d6"} Apr 23 16:36:05.645663 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:05.645634 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6b8968874f-jbt4z" Apr 23 16:36:05.663800 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:05.663742 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-7lg5k" podStartSLOduration=66.015757388 podStartE2EDuration="1m10.663723104s" podCreationTimestamp="2026-04-23 16:34:55 +0000 UTC" firstStartedPulling="2026-04-23 16:36:00.093824598 +0000 UTC m=+70.356610509" lastFinishedPulling="2026-04-23 
16:36:04.741790311 +0000 UTC m=+75.004576225" observedRunningTime="2026-04-23 16:36:05.662338604 +0000 UTC m=+75.925124537" watchObservedRunningTime="2026-04-23 16:36:05.663723104 +0000 UTC m=+75.926509036" Apr 23 16:36:05.725809 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:05.725770 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f6f574d7b-4xfcq"] Apr 23 16:36:05.914477 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:05.914378 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5f6f574d7b-4xfcq" Apr 23 16:36:15.798860 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:15.798741 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj" Apr 23 16:36:15.802750 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:15.802723 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7c7c78bc5d-5knsj" Apr 23 16:36:22.258197 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:22.258165 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-77956f5445-ks96n"] Apr 23 16:36:22.267406 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:22.267372 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77956f5445-ks96n" Apr 23 16:36:22.272553 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:22.272522 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77956f5445-ks96n"] Apr 23 16:36:22.342613 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:22.342538 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsrgs\" (UniqueName: \"kubernetes.io/projected/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-kube-api-access-tsrgs\") pod \"console-77956f5445-ks96n\" (UID: \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\") " pod="openshift-console/console-77956f5445-ks96n" Apr 23 16:36:22.342804 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:22.342635 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-console-oauth-config\") pod \"console-77956f5445-ks96n\" (UID: \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\") " pod="openshift-console/console-77956f5445-ks96n" Apr 23 16:36:22.342804 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:22.342664 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-console-config\") pod \"console-77956f5445-ks96n\" (UID: \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\") " pod="openshift-console/console-77956f5445-ks96n" Apr 23 16:36:22.342804 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:22.342680 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-oauth-serving-cert\") pod \"console-77956f5445-ks96n\" (UID: \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\") " pod="openshift-console/console-77956f5445-ks96n" Apr 23 
16:36:22.342804 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:22.342707 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-console-serving-cert\") pod \"console-77956f5445-ks96n\" (UID: \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\") " pod="openshift-console/console-77956f5445-ks96n" Apr 23 16:36:22.342804 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:22.342774 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-service-ca\") pod \"console-77956f5445-ks96n\" (UID: \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\") " pod="openshift-console/console-77956f5445-ks96n" Apr 23 16:36:22.342804 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:22.342804 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-trusted-ca-bundle\") pod \"console-77956f5445-ks96n\" (UID: \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\") " pod="openshift-console/console-77956f5445-ks96n" Apr 23 16:36:22.443361 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:22.443324 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsrgs\" (UniqueName: \"kubernetes.io/projected/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-kube-api-access-tsrgs\") pod \"console-77956f5445-ks96n\" (UID: \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\") " pod="openshift-console/console-77956f5445-ks96n" Apr 23 16:36:22.443546 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:22.443393 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-console-oauth-config\") pod 
\"console-77956f5445-ks96n\" (UID: \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\") " pod="openshift-console/console-77956f5445-ks96n" Apr 23 16:36:22.443546 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:22.443415 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-console-config\") pod \"console-77956f5445-ks96n\" (UID: \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\") " pod="openshift-console/console-77956f5445-ks96n" Apr 23 16:36:22.443546 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:22.443432 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-oauth-serving-cert\") pod \"console-77956f5445-ks96n\" (UID: \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\") " pod="openshift-console/console-77956f5445-ks96n" Apr 23 16:36:22.443546 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:22.443454 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-console-serving-cert\") pod \"console-77956f5445-ks96n\" (UID: \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\") " pod="openshift-console/console-77956f5445-ks96n" Apr 23 16:36:22.443546 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:22.443482 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-service-ca\") pod \"console-77956f5445-ks96n\" (UID: \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\") " pod="openshift-console/console-77956f5445-ks96n" Apr 23 16:36:22.443797 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:22.443619 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-trusted-ca-bundle\") pod \"console-77956f5445-ks96n\" (UID: \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\") " pod="openshift-console/console-77956f5445-ks96n" Apr 23 16:36:22.444328 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:22.444259 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-oauth-serving-cert\") pod \"console-77956f5445-ks96n\" (UID: \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\") " pod="openshift-console/console-77956f5445-ks96n" Apr 23 16:36:22.444328 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:22.444278 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-console-config\") pod \"console-77956f5445-ks96n\" (UID: \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\") " pod="openshift-console/console-77956f5445-ks96n" Apr 23 16:36:22.444655 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:22.444379 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-service-ca\") pod \"console-77956f5445-ks96n\" (UID: \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\") " pod="openshift-console/console-77956f5445-ks96n" Apr 23 16:36:22.444655 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:22.444452 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-trusted-ca-bundle\") pod \"console-77956f5445-ks96n\" (UID: \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\") " pod="openshift-console/console-77956f5445-ks96n" Apr 23 16:36:22.446263 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:22.446231 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-console-serving-cert\") pod \"console-77956f5445-ks96n\" (UID: \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\") " pod="openshift-console/console-77956f5445-ks96n" Apr 23 16:36:22.446263 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:22.446257 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-console-oauth-config\") pod \"console-77956f5445-ks96n\" (UID: \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\") " pod="openshift-console/console-77956f5445-ks96n" Apr 23 16:36:22.453592 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:22.453550 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsrgs\" (UniqueName: \"kubernetes.io/projected/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-kube-api-access-tsrgs\") pod \"console-77956f5445-ks96n\" (UID: \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\") " pod="openshift-console/console-77956f5445-ks96n" Apr 23 16:36:22.578521 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:22.578412 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77956f5445-ks96n" Apr 23 16:36:22.716629 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:22.716602 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77956f5445-ks96n"] Apr 23 16:36:22.719046 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:36:22.719008 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f8578f4_f872_4adc_ab1b_d3ce1f3623c9.slice/crio-0b5b7c57414fb3cecf9a2e7c7fa83947a8d4181a17aa2d0f7c16b1d427cfa5e7 WatchSource:0}: Error finding container 0b5b7c57414fb3cecf9a2e7c7fa83947a8d4181a17aa2d0f7c16b1d427cfa5e7: Status 404 returned error can't find the container with id 0b5b7c57414fb3cecf9a2e7c7fa83947a8d4181a17aa2d0f7c16b1d427cfa5e7 Apr 23 16:36:23.695486 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:23.695450 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77956f5445-ks96n" event={"ID":"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9","Type":"ContainerStarted","Data":"629f8cf0753fe448c50c0cea546615810f27951fac55b08b103d87e433ea04af"} Apr 23 16:36:23.695486 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:23.695489 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77956f5445-ks96n" event={"ID":"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9","Type":"ContainerStarted","Data":"0b5b7c57414fb3cecf9a2e7c7fa83947a8d4181a17aa2d0f7c16b1d427cfa5e7"} Apr 23 16:36:23.717285 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:23.717223 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-77956f5445-ks96n" podStartSLOduration=1.717191559 podStartE2EDuration="1.717191559s" podCreationTimestamp="2026-04-23 16:36:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:36:23.715108632 +0000 UTC m=+93.977894564" 
watchObservedRunningTime="2026-04-23 16:36:23.717191559 +0000 UTC m=+93.979977489" Apr 23 16:36:30.745932 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:30.745867 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5f6f574d7b-4xfcq" podUID="998fc808-8285-4e28-9013-f783163c059e" containerName="console" containerID="cri-o://421fceebbc03e594ba8f6c4aa7ce4a85450e1341fd64a828d0138efd7bc87dbc" gracePeriod=15 Apr 23 16:36:31.021289 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.021257 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f6f574d7b-4xfcq_998fc808-8285-4e28-9013-f783163c059e/console/0.log" Apr 23 16:36:31.021429 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.021332 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f6f574d7b-4xfcq" Apr 23 16:36:31.120534 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.120492 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/998fc808-8285-4e28-9013-f783163c059e-service-ca\") pod \"998fc808-8285-4e28-9013-f783163c059e\" (UID: \"998fc808-8285-4e28-9013-f783163c059e\") " Apr 23 16:36:31.120534 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.120535 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/998fc808-8285-4e28-9013-f783163c059e-console-oauth-config\") pod \"998fc808-8285-4e28-9013-f783163c059e\" (UID: \"998fc808-8285-4e28-9013-f783163c059e\") " Apr 23 16:36:31.120779 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.120616 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/998fc808-8285-4e28-9013-f783163c059e-console-serving-cert\") pod \"998fc808-8285-4e28-9013-f783163c059e\" (UID: 
\"998fc808-8285-4e28-9013-f783163c059e\") " Apr 23 16:36:31.120834 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.120807 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/998fc808-8285-4e28-9013-f783163c059e-console-config\") pod \"998fc808-8285-4e28-9013-f783163c059e\" (UID: \"998fc808-8285-4e28-9013-f783163c059e\") " Apr 23 16:36:31.120885 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.120859 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgwmr\" (UniqueName: \"kubernetes.io/projected/998fc808-8285-4e28-9013-f783163c059e-kube-api-access-bgwmr\") pod \"998fc808-8285-4e28-9013-f783163c059e\" (UID: \"998fc808-8285-4e28-9013-f783163c059e\") " Apr 23 16:36:31.120950 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.120887 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/998fc808-8285-4e28-9013-f783163c059e-oauth-serving-cert\") pod \"998fc808-8285-4e28-9013-f783163c059e\" (UID: \"998fc808-8285-4e28-9013-f783163c059e\") " Apr 23 16:36:31.121002 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.120976 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/998fc808-8285-4e28-9013-f783163c059e-service-ca" (OuterVolumeSpecName: "service-ca") pod "998fc808-8285-4e28-9013-f783163c059e" (UID: "998fc808-8285-4e28-9013-f783163c059e"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:36:31.121198 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.121164 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/998fc808-8285-4e28-9013-f783163c059e-service-ca\") on node \"ip-10-0-142-4.ec2.internal\" DevicePath \"\"" Apr 23 16:36:31.121302 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.121248 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/998fc808-8285-4e28-9013-f783163c059e-console-config" (OuterVolumeSpecName: "console-config") pod "998fc808-8285-4e28-9013-f783163c059e" (UID: "998fc808-8285-4e28-9013-f783163c059e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:36:31.121356 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.121306 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/998fc808-8285-4e28-9013-f783163c059e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "998fc808-8285-4e28-9013-f783163c059e" (UID: "998fc808-8285-4e28-9013-f783163c059e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:36:31.123240 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.123208 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/998fc808-8285-4e28-9013-f783163c059e-kube-api-access-bgwmr" (OuterVolumeSpecName: "kube-api-access-bgwmr") pod "998fc808-8285-4e28-9013-f783163c059e" (UID: "998fc808-8285-4e28-9013-f783163c059e"). InnerVolumeSpecName "kube-api-access-bgwmr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:36:31.123395 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.123372 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/998fc808-8285-4e28-9013-f783163c059e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "998fc808-8285-4e28-9013-f783163c059e" (UID: "998fc808-8285-4e28-9013-f783163c059e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:36:31.123466 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.123449 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/998fc808-8285-4e28-9013-f783163c059e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "998fc808-8285-4e28-9013-f783163c059e" (UID: "998fc808-8285-4e28-9013-f783163c059e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:36:31.221668 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.221631 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/998fc808-8285-4e28-9013-f783163c059e-console-serving-cert\") on node \"ip-10-0-142-4.ec2.internal\" DevicePath \"\"" Apr 23 16:36:31.221668 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.221661 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/998fc808-8285-4e28-9013-f783163c059e-console-config\") on node \"ip-10-0-142-4.ec2.internal\" DevicePath \"\"" Apr 23 16:36:31.221668 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.221671 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bgwmr\" (UniqueName: \"kubernetes.io/projected/998fc808-8285-4e28-9013-f783163c059e-kube-api-access-bgwmr\") on node \"ip-10-0-142-4.ec2.internal\" DevicePath \"\"" Apr 23 16:36:31.221668 ip-10-0-142-4 
kubenswrapper[2580]: I0423 16:36:31.221680 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/998fc808-8285-4e28-9013-f783163c059e-oauth-serving-cert\") on node \"ip-10-0-142-4.ec2.internal\" DevicePath \"\"" Apr 23 16:36:31.221927 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.221689 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/998fc808-8285-4e28-9013-f783163c059e-console-oauth-config\") on node \"ip-10-0-142-4.ec2.internal\" DevicePath \"\"" Apr 23 16:36:31.627476 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.627442 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-7bn2z" Apr 23 16:36:31.720098 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.720068 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f6f574d7b-4xfcq_998fc808-8285-4e28-9013-f783163c059e/console/0.log" Apr 23 16:36:31.720257 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.720121 2580 generic.go:358] "Generic (PLEG): container finished" podID="998fc808-8285-4e28-9013-f783163c059e" containerID="421fceebbc03e594ba8f6c4aa7ce4a85450e1341fd64a828d0138efd7bc87dbc" exitCode=2 Apr 23 16:36:31.720257 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.720185 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f6f574d7b-4xfcq" Apr 23 16:36:31.720257 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.720195 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f6f574d7b-4xfcq" event={"ID":"998fc808-8285-4e28-9013-f783163c059e","Type":"ContainerDied","Data":"421fceebbc03e594ba8f6c4aa7ce4a85450e1341fd64a828d0138efd7bc87dbc"} Apr 23 16:36:31.720257 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.720230 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f6f574d7b-4xfcq" event={"ID":"998fc808-8285-4e28-9013-f783163c059e","Type":"ContainerDied","Data":"7208ca0816713f62376295fb0e51f22af81d1a5ef9b2b5e90e1856e11ac88f59"} Apr 23 16:36:31.720257 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.720247 2580 scope.go:117] "RemoveContainer" containerID="421fceebbc03e594ba8f6c4aa7ce4a85450e1341fd64a828d0138efd7bc87dbc" Apr 23 16:36:31.729215 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.729197 2580 scope.go:117] "RemoveContainer" containerID="421fceebbc03e594ba8f6c4aa7ce4a85450e1341fd64a828d0138efd7bc87dbc" Apr 23 16:36:31.729529 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:36:31.729508 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"421fceebbc03e594ba8f6c4aa7ce4a85450e1341fd64a828d0138efd7bc87dbc\": container with ID starting with 421fceebbc03e594ba8f6c4aa7ce4a85450e1341fd64a828d0138efd7bc87dbc not found: ID does not exist" containerID="421fceebbc03e594ba8f6c4aa7ce4a85450e1341fd64a828d0138efd7bc87dbc" Apr 23 16:36:31.729616 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.729538 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"421fceebbc03e594ba8f6c4aa7ce4a85450e1341fd64a828d0138efd7bc87dbc"} err="failed to get container status \"421fceebbc03e594ba8f6c4aa7ce4a85450e1341fd64a828d0138efd7bc87dbc\": rpc error: code = NotFound desc 
= could not find container \"421fceebbc03e594ba8f6c4aa7ce4a85450e1341fd64a828d0138efd7bc87dbc\": container with ID starting with 421fceebbc03e594ba8f6c4aa7ce4a85450e1341fd64a828d0138efd7bc87dbc not found: ID does not exist" Apr 23 16:36:31.742897 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.742864 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f6f574d7b-4xfcq"] Apr 23 16:36:31.746017 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:31.745990 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5f6f574d7b-4xfcq"] Apr 23 16:36:32.281592 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:32.281543 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="998fc808-8285-4e28-9013-f783163c059e" path="/var/lib/kubelet/pods/998fc808-8285-4e28-9013-f783163c059e/volumes" Apr 23 16:36:32.579240 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:32.579148 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-77956f5445-ks96n" Apr 23 16:36:32.579240 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:32.579231 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-77956f5445-ks96n" Apr 23 16:36:32.584100 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:32.584075 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-77956f5445-ks96n" Apr 23 16:36:32.728104 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:32.728077 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-77956f5445-ks96n" Apr 23 16:36:32.786807 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:32.786774 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b8968874f-jbt4z"] Apr 23 16:36:57.811976 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:57.811907 2580 kuberuntime_container.go:864] "Killing container 
with a grace period" pod="openshift-console/console-6b8968874f-jbt4z" podUID="f13e47b0-66c9-4673-8591-38a90b4e25bd" containerName="console" containerID="cri-o://599b215ffb4f14a593a04ad26c56b323e1a4b059eefb7e67806c4f0f8325027a" gracePeriod=15 Apr 23 16:36:58.050731 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.050702 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b8968874f-jbt4z_f13e47b0-66c9-4673-8591-38a90b4e25bd/console/0.log" Apr 23 16:36:58.050870 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.050772 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b8968874f-jbt4z" Apr 23 16:36:58.155771 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.155669 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f13e47b0-66c9-4673-8591-38a90b4e25bd-console-serving-cert\") pod \"f13e47b0-66c9-4673-8591-38a90b4e25bd\" (UID: \"f13e47b0-66c9-4673-8591-38a90b4e25bd\") " Apr 23 16:36:58.155771 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.155732 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx24b\" (UniqueName: \"kubernetes.io/projected/f13e47b0-66c9-4673-8591-38a90b4e25bd-kube-api-access-hx24b\") pod \"f13e47b0-66c9-4673-8591-38a90b4e25bd\" (UID: \"f13e47b0-66c9-4673-8591-38a90b4e25bd\") " Apr 23 16:36:58.155771 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.155756 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f13e47b0-66c9-4673-8591-38a90b4e25bd-oauth-serving-cert\") pod \"f13e47b0-66c9-4673-8591-38a90b4e25bd\" (UID: \"f13e47b0-66c9-4673-8591-38a90b4e25bd\") " Apr 23 16:36:58.156046 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.155800 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f13e47b0-66c9-4673-8591-38a90b4e25bd-trusted-ca-bundle\") pod \"f13e47b0-66c9-4673-8591-38a90b4e25bd\" (UID: \"f13e47b0-66c9-4673-8591-38a90b4e25bd\") " Apr 23 16:36:58.156046 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.155852 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f13e47b0-66c9-4673-8591-38a90b4e25bd-console-config\") pod \"f13e47b0-66c9-4673-8591-38a90b4e25bd\" (UID: \"f13e47b0-66c9-4673-8591-38a90b4e25bd\") " Apr 23 16:36:58.156046 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.155888 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f13e47b0-66c9-4673-8591-38a90b4e25bd-service-ca\") pod \"f13e47b0-66c9-4673-8591-38a90b4e25bd\" (UID: \"f13e47b0-66c9-4673-8591-38a90b4e25bd\") " Apr 23 16:36:58.156046 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.155947 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f13e47b0-66c9-4673-8591-38a90b4e25bd-console-oauth-config\") pod \"f13e47b0-66c9-4673-8591-38a90b4e25bd\" (UID: \"f13e47b0-66c9-4673-8591-38a90b4e25bd\") " Apr 23 16:36:58.156373 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.156334 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f13e47b0-66c9-4673-8591-38a90b4e25bd-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f13e47b0-66c9-4673-8591-38a90b4e25bd" (UID: "f13e47b0-66c9-4673-8591-38a90b4e25bd"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:36:58.156373 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.156365 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f13e47b0-66c9-4673-8591-38a90b4e25bd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f13e47b0-66c9-4673-8591-38a90b4e25bd" (UID: "f13e47b0-66c9-4673-8591-38a90b4e25bd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:36:58.156570 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.156490 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f13e47b0-66c9-4673-8591-38a90b4e25bd-service-ca" (OuterVolumeSpecName: "service-ca") pod "f13e47b0-66c9-4673-8591-38a90b4e25bd" (UID: "f13e47b0-66c9-4673-8591-38a90b4e25bd"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:36:58.156570 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.156506 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f13e47b0-66c9-4673-8591-38a90b4e25bd-console-config" (OuterVolumeSpecName: "console-config") pod "f13e47b0-66c9-4673-8591-38a90b4e25bd" (UID: "f13e47b0-66c9-4673-8591-38a90b4e25bd"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:36:58.158424 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.158402 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f13e47b0-66c9-4673-8591-38a90b4e25bd-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f13e47b0-66c9-4673-8591-38a90b4e25bd" (UID: "f13e47b0-66c9-4673-8591-38a90b4e25bd"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:36:58.158723 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.158691 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f13e47b0-66c9-4673-8591-38a90b4e25bd-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f13e47b0-66c9-4673-8591-38a90b4e25bd" (UID: "f13e47b0-66c9-4673-8591-38a90b4e25bd"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:36:58.158813 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.158750 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f13e47b0-66c9-4673-8591-38a90b4e25bd-kube-api-access-hx24b" (OuterVolumeSpecName: "kube-api-access-hx24b") pod "f13e47b0-66c9-4673-8591-38a90b4e25bd" (UID: "f13e47b0-66c9-4673-8591-38a90b4e25bd"). InnerVolumeSpecName "kube-api-access-hx24b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:36:58.257198 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.257154 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f13e47b0-66c9-4673-8591-38a90b4e25bd-service-ca\") on node \"ip-10-0-142-4.ec2.internal\" DevicePath \"\"" Apr 23 16:36:58.257198 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.257188 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f13e47b0-66c9-4673-8591-38a90b4e25bd-console-oauth-config\") on node \"ip-10-0-142-4.ec2.internal\" DevicePath \"\"" Apr 23 16:36:58.257198 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.257198 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f13e47b0-66c9-4673-8591-38a90b4e25bd-console-serving-cert\") on node \"ip-10-0-142-4.ec2.internal\" DevicePath \"\"" Apr 23 16:36:58.257198 ip-10-0-142-4 kubenswrapper[2580]: 
I0423 16:36:58.257207 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hx24b\" (UniqueName: \"kubernetes.io/projected/f13e47b0-66c9-4673-8591-38a90b4e25bd-kube-api-access-hx24b\") on node \"ip-10-0-142-4.ec2.internal\" DevicePath \"\"" Apr 23 16:36:58.257464 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.257218 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f13e47b0-66c9-4673-8591-38a90b4e25bd-oauth-serving-cert\") on node \"ip-10-0-142-4.ec2.internal\" DevicePath \"\"" Apr 23 16:36:58.257464 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.257228 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f13e47b0-66c9-4673-8591-38a90b4e25bd-trusted-ca-bundle\") on node \"ip-10-0-142-4.ec2.internal\" DevicePath \"\"" Apr 23 16:36:58.257464 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.257236 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f13e47b0-66c9-4673-8591-38a90b4e25bd-console-config\") on node \"ip-10-0-142-4.ec2.internal\" DevicePath \"\"" Apr 23 16:36:58.799211 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.799182 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b8968874f-jbt4z_f13e47b0-66c9-4673-8591-38a90b4e25bd/console/0.log" Apr 23 16:36:58.799368 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.799226 2580 generic.go:358] "Generic (PLEG): container finished" podID="f13e47b0-66c9-4673-8591-38a90b4e25bd" containerID="599b215ffb4f14a593a04ad26c56b323e1a4b059eefb7e67806c4f0f8325027a" exitCode=2 Apr 23 16:36:58.799368 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.799300 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b8968874f-jbt4z" Apr 23 16:36:58.799368 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.799311 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b8968874f-jbt4z" event={"ID":"f13e47b0-66c9-4673-8591-38a90b4e25bd","Type":"ContainerDied","Data":"599b215ffb4f14a593a04ad26c56b323e1a4b059eefb7e67806c4f0f8325027a"} Apr 23 16:36:58.799368 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.799346 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b8968874f-jbt4z" event={"ID":"f13e47b0-66c9-4673-8591-38a90b4e25bd","Type":"ContainerDied","Data":"eaed68fe0a6542e5da180a7cc5891e021eae55818b4e0164fbbf13b18234a0e0"} Apr 23 16:36:58.799368 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.799361 2580 scope.go:117] "RemoveContainer" containerID="599b215ffb4f14a593a04ad26c56b323e1a4b059eefb7e67806c4f0f8325027a" Apr 23 16:36:58.807227 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.807199 2580 scope.go:117] "RemoveContainer" containerID="599b215ffb4f14a593a04ad26c56b323e1a4b059eefb7e67806c4f0f8325027a" Apr 23 16:36:58.807468 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:36:58.807450 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"599b215ffb4f14a593a04ad26c56b323e1a4b059eefb7e67806c4f0f8325027a\": container with ID starting with 599b215ffb4f14a593a04ad26c56b323e1a4b059eefb7e67806c4f0f8325027a not found: ID does not exist" containerID="599b215ffb4f14a593a04ad26c56b323e1a4b059eefb7e67806c4f0f8325027a" Apr 23 16:36:58.807513 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.807483 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"599b215ffb4f14a593a04ad26c56b323e1a4b059eefb7e67806c4f0f8325027a"} err="failed to get container status \"599b215ffb4f14a593a04ad26c56b323e1a4b059eefb7e67806c4f0f8325027a\": rpc error: code = NotFound desc 
= could not find container \"599b215ffb4f14a593a04ad26c56b323e1a4b059eefb7e67806c4f0f8325027a\": container with ID starting with 599b215ffb4f14a593a04ad26c56b323e1a4b059eefb7e67806c4f0f8325027a not found: ID does not exist" Apr 23 16:36:58.822241 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.822213 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b8968874f-jbt4z"] Apr 23 16:36:58.829719 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:36:58.829695 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6b8968874f-jbt4z"] Apr 23 16:37:00.281427 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:00.281390 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f13e47b0-66c9-4673-8591-38a90b4e25bd" path="/var/lib/kubelet/pods/f13e47b0-66c9-4673-8591-38a90b4e25bd/volumes" Apr 23 16:37:14.133118 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:14.133085 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84"] Apr 23 16:37:14.133600 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:14.133387 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="998fc808-8285-4e28-9013-f783163c059e" containerName="console" Apr 23 16:37:14.133600 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:14.133397 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="998fc808-8285-4e28-9013-f783163c059e" containerName="console" Apr 23 16:37:14.133600 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:14.133409 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f13e47b0-66c9-4673-8591-38a90b4e25bd" containerName="console" Apr 23 16:37:14.133600 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:14.133415 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f13e47b0-66c9-4673-8591-38a90b4e25bd" containerName="console" Apr 23 16:37:14.133600 ip-10-0-142-4 
kubenswrapper[2580]: I0423 16:37:14.133459 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="998fc808-8285-4e28-9013-f783163c059e" containerName="console" Apr 23 16:37:14.133600 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:14.133470 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="f13e47b0-66c9-4673-8591-38a90b4e25bd" containerName="console" Apr 23 16:37:14.136436 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:14.136417 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84" Apr 23 16:37:14.141061 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:14.141039 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 16:37:14.141142 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:14.141052 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-rqftk\"" Apr 23 16:37:14.141743 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:14.141713 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 16:37:14.152754 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:14.152723 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84"] Apr 23 16:37:14.182778 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:14.182736 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57117bc9-d636-463f-9e3b-d21d2561c8db-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84\" (UID: \"57117bc9-d636-463f-9e3b-d21d2561c8db\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84" Apr 23 
16:37:14.182778 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:14.182782 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94gjc\" (UniqueName: \"kubernetes.io/projected/57117bc9-d636-463f-9e3b-d21d2561c8db-kube-api-access-94gjc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84\" (UID: \"57117bc9-d636-463f-9e3b-d21d2561c8db\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84" Apr 23 16:37:14.182992 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:14.182823 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57117bc9-d636-463f-9e3b-d21d2561c8db-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84\" (UID: \"57117bc9-d636-463f-9e3b-d21d2561c8db\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84" Apr 23 16:37:14.283289 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:14.283248 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57117bc9-d636-463f-9e3b-d21d2561c8db-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84\" (UID: \"57117bc9-d636-463f-9e3b-d21d2561c8db\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84" Apr 23 16:37:14.283289 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:14.283288 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94gjc\" (UniqueName: \"kubernetes.io/projected/57117bc9-d636-463f-9e3b-d21d2561c8db-kube-api-access-94gjc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84\" (UID: \"57117bc9-d636-463f-9e3b-d21d2561c8db\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84" Apr 23 16:37:14.283497 
ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:14.283328 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57117bc9-d636-463f-9e3b-d21d2561c8db-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84\" (UID: \"57117bc9-d636-463f-9e3b-d21d2561c8db\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84" Apr 23 16:37:14.283736 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:14.283715 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57117bc9-d636-463f-9e3b-d21d2561c8db-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84\" (UID: \"57117bc9-d636-463f-9e3b-d21d2561c8db\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84" Apr 23 16:37:14.283779 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:14.283727 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57117bc9-d636-463f-9e3b-d21d2561c8db-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84\" (UID: \"57117bc9-d636-463f-9e3b-d21d2561c8db\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84" Apr 23 16:37:14.292721 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:14.292690 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94gjc\" (UniqueName: \"kubernetes.io/projected/57117bc9-d636-463f-9e3b-d21d2561c8db-kube-api-access-94gjc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84\" (UID: \"57117bc9-d636-463f-9e3b-d21d2561c8db\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84" Apr 23 16:37:14.445973 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:14.445858 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84" Apr 23 16:37:14.575502 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:14.575400 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84"] Apr 23 16:37:14.844530 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:14.844485 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84" event={"ID":"57117bc9-d636-463f-9e3b-d21d2561c8db","Type":"ContainerStarted","Data":"21a7dc65856068fe38151b086b510aed81fe5998c1e5ee4d8426e23f615c113e"} Apr 23 16:37:22.869113 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:22.869073 2580 generic.go:358] "Generic (PLEG): container finished" podID="57117bc9-d636-463f-9e3b-d21d2561c8db" containerID="8add931f07885c95bc14ba26eb14d240c502d8a30bef6485a6960b86dbdbac8b" exitCode=0 Apr 23 16:37:22.869537 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:22.869129 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84" event={"ID":"57117bc9-d636-463f-9e3b-d21d2561c8db","Type":"ContainerDied","Data":"8add931f07885c95bc14ba26eb14d240c502d8a30bef6485a6960b86dbdbac8b"} Apr 23 16:37:29.891432 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:29.891393 2580 generic.go:358] "Generic (PLEG): container finished" podID="57117bc9-d636-463f-9e3b-d21d2561c8db" containerID="1c3707eacc9cd642ee11a0eb4bb58aa39886fd2d2825716a4684bd529ac5f3d3" exitCode=0 Apr 23 16:37:29.891826 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:29.891469 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84" 
event={"ID":"57117bc9-d636-463f-9e3b-d21d2561c8db","Type":"ContainerDied","Data":"1c3707eacc9cd642ee11a0eb4bb58aa39886fd2d2825716a4684bd529ac5f3d3"} Apr 23 16:37:37.916952 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:37.916854 2580 generic.go:358] "Generic (PLEG): container finished" podID="57117bc9-d636-463f-9e3b-d21d2561c8db" containerID="0b099cdeac3b2baf77e37ebf7388046b074d89d87e39dd7ff9e275fd74ed4302" exitCode=0 Apr 23 16:37:37.916952 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:37.916932 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84" event={"ID":"57117bc9-d636-463f-9e3b-d21d2561c8db","Type":"ContainerDied","Data":"0b099cdeac3b2baf77e37ebf7388046b074d89d87e39dd7ff9e275fd74ed4302"} Apr 23 16:37:39.048654 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:39.048628 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84" Apr 23 16:37:39.193144 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:39.193046 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94gjc\" (UniqueName: \"kubernetes.io/projected/57117bc9-d636-463f-9e3b-d21d2561c8db-kube-api-access-94gjc\") pod \"57117bc9-d636-463f-9e3b-d21d2561c8db\" (UID: \"57117bc9-d636-463f-9e3b-d21d2561c8db\") " Apr 23 16:37:39.193144 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:39.193087 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57117bc9-d636-463f-9e3b-d21d2561c8db-bundle\") pod \"57117bc9-d636-463f-9e3b-d21d2561c8db\" (UID: \"57117bc9-d636-463f-9e3b-d21d2561c8db\") " Apr 23 16:37:39.193144 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:39.193145 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/57117bc9-d636-463f-9e3b-d21d2561c8db-util\") pod \"57117bc9-d636-463f-9e3b-d21d2561c8db\" (UID: \"57117bc9-d636-463f-9e3b-d21d2561c8db\") "
Apr 23 16:37:39.193747 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:39.193709 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57117bc9-d636-463f-9e3b-d21d2561c8db-bundle" (OuterVolumeSpecName: "bundle") pod "57117bc9-d636-463f-9e3b-d21d2561c8db" (UID: "57117bc9-d636-463f-9e3b-d21d2561c8db"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:37:39.195450 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:39.195421 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57117bc9-d636-463f-9e3b-d21d2561c8db-kube-api-access-94gjc" (OuterVolumeSpecName: "kube-api-access-94gjc") pod "57117bc9-d636-463f-9e3b-d21d2561c8db" (UID: "57117bc9-d636-463f-9e3b-d21d2561c8db"). InnerVolumeSpecName "kube-api-access-94gjc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:37:39.198119 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:39.198094 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57117bc9-d636-463f-9e3b-d21d2561c8db-util" (OuterVolumeSpecName: "util") pod "57117bc9-d636-463f-9e3b-d21d2561c8db" (UID: "57117bc9-d636-463f-9e3b-d21d2561c8db"). InnerVolumeSpecName "util".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:37:39.294309 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:39.294267 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57117bc9-d636-463f-9e3b-d21d2561c8db-util\") on node \"ip-10-0-142-4.ec2.internal\" DevicePath \"\""
Apr 23 16:37:39.294309 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:39.294299 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-94gjc\" (UniqueName: \"kubernetes.io/projected/57117bc9-d636-463f-9e3b-d21d2561c8db-kube-api-access-94gjc\") on node \"ip-10-0-142-4.ec2.internal\" DevicePath \"\""
Apr 23 16:37:39.294309 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:39.294311 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57117bc9-d636-463f-9e3b-d21d2561c8db-bundle\") on node \"ip-10-0-142-4.ec2.internal\" DevicePath \"\""
Apr 23 16:37:39.925119 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:39.925077 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84" event={"ID":"57117bc9-d636-463f-9e3b-d21d2561c8db","Type":"ContainerDied","Data":"21a7dc65856068fe38151b086b510aed81fe5998c1e5ee4d8426e23f615c113e"}
Apr 23 16:37:39.925119 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:39.925114 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21a7dc65856068fe38151b086b510aed81fe5998c1e5ee4d8426e23f615c113e"
Apr 23 16:37:39.925336 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:39.925141 2580 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c95h84"
Apr 23 16:37:47.000296 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:47.000254 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vkrj4"]
Apr 23 16:37:47.000851 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:47.000597 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57117bc9-d636-463f-9e3b-d21d2561c8db" containerName="pull"
Apr 23 16:37:47.000851 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:47.000609 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="57117bc9-d636-463f-9e3b-d21d2561c8db" containerName="pull"
Apr 23 16:37:47.000851 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:47.000623 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57117bc9-d636-463f-9e3b-d21d2561c8db" containerName="util"
Apr 23 16:37:47.000851 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:47.000629 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="57117bc9-d636-463f-9e3b-d21d2561c8db" containerName="util"
Apr 23 16:37:47.000851 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:47.000638 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57117bc9-d636-463f-9e3b-d21d2561c8db" containerName="extract"
Apr 23 16:37:47.000851 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:47.000644 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="57117bc9-d636-463f-9e3b-d21d2561c8db" containerName="extract"
Apr 23 16:37:47.000851 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:47.000698 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="57117bc9-d636-463f-9e3b-d21d2561c8db" containerName="extract"
Apr 23 16:37:47.005163 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:47.005136 2580 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vkrj4"
Apr 23 16:37:47.008934 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:47.008907 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 23 16:37:47.009099 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:47.009080 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-gzcjs\""
Apr 23 16:37:47.009525 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:47.009503 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 23 16:37:47.009635 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:47.009612 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 23 16:37:47.021722 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:47.021694 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vkrj4"]
Apr 23 16:37:47.160928 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:47.160895 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/bef0cdb3-1ef4-4472-b857-b8f84ce41921-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-vkrj4\" (UID: \"bef0cdb3-1ef4-4472-b857-b8f84ce41921\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vkrj4"
Apr 23 16:37:47.160928 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:47.160944 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84png\" (UniqueName: \"kubernetes.io/projected/bef0cdb3-1ef4-4472-b857-b8f84ce41921-kube-api-access-84png\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-vkrj4\" (UID:
\"bef0cdb3-1ef4-4472-b857-b8f84ce41921\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vkrj4"
Apr 23 16:37:47.262319 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:47.262202 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/bef0cdb3-1ef4-4472-b857-b8f84ce41921-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-vkrj4\" (UID: \"bef0cdb3-1ef4-4472-b857-b8f84ce41921\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vkrj4"
Apr 23 16:37:47.262319 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:47.262270 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84png\" (UniqueName: \"kubernetes.io/projected/bef0cdb3-1ef4-4472-b857-b8f84ce41921-kube-api-access-84png\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-vkrj4\" (UID: \"bef0cdb3-1ef4-4472-b857-b8f84ce41921\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vkrj4"
Apr 23 16:37:47.264865 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:47.264837 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/bef0cdb3-1ef4-4472-b857-b8f84ce41921-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-vkrj4\" (UID: \"bef0cdb3-1ef4-4472-b857-b8f84ce41921\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vkrj4"
Apr 23 16:37:47.272064 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:47.272037 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84png\" (UniqueName: \"kubernetes.io/projected/bef0cdb3-1ef4-4472-b857-b8f84ce41921-kube-api-access-84png\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-vkrj4\" (UID: \"bef0cdb3-1ef4-4472-b857-b8f84ce41921\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vkrj4"
Apr 23 16:37:47.317037 ip-10-0-142-4 kubenswrapper[2580]: I0423
16:37:47.316994 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vkrj4"
Apr 23 16:37:47.452863 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:47.452823 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vkrj4"]
Apr 23 16:37:47.456780 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:37:47.456749 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbef0cdb3_1ef4_4472_b857_b8f84ce41921.slice/crio-a0ec0040cc1b09b2a7b9cddff98b4282120dadc80b7d931fcbda0711bdfece98 WatchSource:0}: Error finding container a0ec0040cc1b09b2a7b9cddff98b4282120dadc80b7d931fcbda0711bdfece98: Status 404 returned error can't find the container with id a0ec0040cc1b09b2a7b9cddff98b4282120dadc80b7d931fcbda0711bdfece98
Apr 23 16:37:47.948520 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:47.948482 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vkrj4" event={"ID":"bef0cdb3-1ef4-4472-b857-b8f84ce41921","Type":"ContainerStarted","Data":"a0ec0040cc1b09b2a7b9cddff98b4282120dadc80b7d931fcbda0711bdfece98"}
Apr 23 16:37:51.964631 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:51.964571 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vkrj4" event={"ID":"bef0cdb3-1ef4-4472-b857-b8f84ce41921","Type":"ContainerStarted","Data":"072c1eb66e6ebe3d04dce6c0c25bfa67797ae3c4469fdd5496a39abf97cf360a"}
Apr 23 16:37:51.965044 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:51.964641 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vkrj4"
Apr 23 16:37:51.978688 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:51.978657 2580 kubelet.go:2537] "SyncLoop ADD"
source="api" pods=["openshift-keda/keda-operator-ffbb595cb-tsgvj"]
Apr 23 16:37:51.981799 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:51.981781 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-tsgvj"
Apr 23 16:37:51.984267 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:51.984234 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 23 16:37:51.984267 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:51.984263 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 23 16:37:51.984447 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:51.984274 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-9rzp9\""
Apr 23 16:37:51.991629 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:51.991601 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-tsgvj"]
Apr 23 16:37:51.995744 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:51.995691 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vkrj4" podStartSLOduration=2.153626218 podStartE2EDuration="5.995674284s" podCreationTimestamp="2026-04-23 16:37:46 +0000 UTC" firstStartedPulling="2026-04-23 16:37:47.458668252 +0000 UTC m=+177.721454163" lastFinishedPulling="2026-04-23 16:37:51.300716317 +0000 UTC m=+181.563502229" observedRunningTime="2026-04-23 16:37:51.993558241 +0000 UTC m=+182.256344188" watchObservedRunningTime="2026-04-23 16:37:51.995674284 +0000 UTC m=+182.258460219"
Apr 23 16:37:52.105665 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.105628 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r72tl\" (UniqueName:
\"kubernetes.io/projected/6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe-kube-api-access-r72tl\") pod \"keda-operator-ffbb595cb-tsgvj\" (UID: \"6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe\") " pod="openshift-keda/keda-operator-ffbb595cb-tsgvj"
Apr 23 16:37:52.105857 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.105685 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe-certificates\") pod \"keda-operator-ffbb595cb-tsgvj\" (UID: \"6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe\") " pod="openshift-keda/keda-operator-ffbb595cb-tsgvj"
Apr 23 16:37:52.105908 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.105876 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe-cabundle0\") pod \"keda-operator-ffbb595cb-tsgvj\" (UID: \"6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe\") " pod="openshift-keda/keda-operator-ffbb595cb-tsgvj"
Apr 23 16:37:52.206871 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.206821 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe-certificates\") pod \"keda-operator-ffbb595cb-tsgvj\" (UID: \"6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe\") " pod="openshift-keda/keda-operator-ffbb595cb-tsgvj"
Apr 23 16:37:52.207056 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.206886 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe-cabundle0\") pod \"keda-operator-ffbb595cb-tsgvj\" (UID: \"6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe\") " pod="openshift-keda/keda-operator-ffbb595cb-tsgvj"
Apr 23 16:37:52.207056 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.206929 2580 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"kube-api-access-r72tl\" (UniqueName: \"kubernetes.io/projected/6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe-kube-api-access-r72tl\") pod \"keda-operator-ffbb595cb-tsgvj\" (UID: \"6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe\") " pod="openshift-keda/keda-operator-ffbb595cb-tsgvj"
Apr 23 16:37:52.207056 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:52.206993 2580 secret.go:281] references non-existent secret key: ca.crt
Apr 23 16:37:52.207056 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:52.207019 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 23 16:37:52.207056 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:52.207031 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-tsgvj: references non-existent secret key: ca.crt
Apr 23 16:37:52.207305 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:52.207095 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe-certificates podName:6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe nodeName:}" failed. No retries permitted until 2026-04-23 16:37:52.7070757 +0000 UTC m=+182.969861629 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe-certificates") pod "keda-operator-ffbb595cb-tsgvj" (UID: "6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe") : references non-existent secret key: ca.crt
Apr 23 16:37:52.207636 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.207613 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe-cabundle0\") pod \"keda-operator-ffbb595cb-tsgvj\" (UID: \"6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe\") " pod="openshift-keda/keda-operator-ffbb595cb-tsgvj"
Apr 23 16:37:52.219022 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.218955 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r72tl\" (UniqueName: \"kubernetes.io/projected/6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe-kube-api-access-r72tl\") pod \"keda-operator-ffbb595cb-tsgvj\" (UID: \"6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe\") " pod="openshift-keda/keda-operator-ffbb595cb-tsgvj"
Apr 23 16:37:52.252890 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.252855 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-88vnq"]
Apr 23 16:37:52.256278 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.256245 2580 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-88vnq"
Apr 23 16:37:52.258883 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.258856 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 23 16:37:52.264972 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.264936 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-88vnq"]
Apr 23 16:37:52.408952 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.408904 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/98320b55-882c-49f4-afdc-bab1e27ea968-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-88vnq\" (UID: \"98320b55-882c-49f4-afdc-bab1e27ea968\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-88vnq"
Apr 23 16:37:52.408952 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.408951 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/98320b55-882c-49f4-afdc-bab1e27ea968-certificates\") pod \"keda-metrics-apiserver-7c9f485588-88vnq\" (UID: \"98320b55-882c-49f4-afdc-bab1e27ea968\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-88vnq"
Apr 23 16:37:52.409170 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.409104 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcv69\" (UniqueName: \"kubernetes.io/projected/98320b55-882c-49f4-afdc-bab1e27ea968-kube-api-access-fcv69\") pod \"keda-metrics-apiserver-7c9f485588-88vnq\" (UID: \"98320b55-882c-49f4-afdc-bab1e27ea968\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-88vnq"
Apr 23 16:37:52.510088 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.509992 2580 reconciler_common.go:224] "operationExecutor.MountVolume started
for volume \"kube-api-access-fcv69\" (UniqueName: \"kubernetes.io/projected/98320b55-882c-49f4-afdc-bab1e27ea968-kube-api-access-fcv69\") pod \"keda-metrics-apiserver-7c9f485588-88vnq\" (UID: \"98320b55-882c-49f4-afdc-bab1e27ea968\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-88vnq"
Apr 23 16:37:52.510088 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.510049 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/98320b55-882c-49f4-afdc-bab1e27ea968-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-88vnq\" (UID: \"98320b55-882c-49f4-afdc-bab1e27ea968\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-88vnq"
Apr 23 16:37:52.510088 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.510067 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/98320b55-882c-49f4-afdc-bab1e27ea968-certificates\") pod \"keda-metrics-apiserver-7c9f485588-88vnq\" (UID: \"98320b55-882c-49f4-afdc-bab1e27ea968\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-88vnq"
Apr 23 16:37:52.510308 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:52.510161 2580 secret.go:281] references non-existent secret key: tls.crt
Apr 23 16:37:52.510308 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:52.510172 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 23 16:37:52.510308 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:52.510187 2580 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found
Apr 23 16:37:52.510308 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:52.510205 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-88vnq: [references non-existent secret key: tls.crt, secret
"keda-metrics-apiserver-certs" not found]
Apr 23 16:37:52.510308 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:52.510258 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/98320b55-882c-49f4-afdc-bab1e27ea968-certificates podName:98320b55-882c-49f4-afdc-bab1e27ea968 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:53.010244272 +0000 UTC m=+183.273030183 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/98320b55-882c-49f4-afdc-bab1e27ea968-certificates") pod "keda-metrics-apiserver-7c9f485588-88vnq" (UID: "98320b55-882c-49f4-afdc-bab1e27ea968") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 23 16:37:52.510471 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.510451 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/98320b55-882c-49f4-afdc-bab1e27ea968-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-88vnq\" (UID: \"98320b55-882c-49f4-afdc-bab1e27ea968\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-88vnq"
Apr 23 16:37:52.521799 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.521762 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcv69\" (UniqueName: \"kubernetes.io/projected/98320b55-882c-49f4-afdc-bab1e27ea968-kube-api-access-fcv69\") pod \"keda-metrics-apiserver-7c9f485588-88vnq\" (UID: \"98320b55-882c-49f4-afdc-bab1e27ea968\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-88vnq"
Apr 23 16:37:52.609768 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.609735 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-b7rgr"]
Apr 23 16:37:52.613058 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.613025 2580 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-b7rgr"
Apr 23 16:37:52.615631 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.615601 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 23 16:37:52.624485 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.624457 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-b7rgr"]
Apr 23 16:37:52.712301 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.712262 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh67b\" (UniqueName: \"kubernetes.io/projected/a7cc0950-4b68-4f76-aa8c-12168615ca92-kube-api-access-hh67b\") pod \"keda-admission-cf49989db-b7rgr\" (UID: \"a7cc0950-4b68-4f76-aa8c-12168615ca92\") " pod="openshift-keda/keda-admission-cf49989db-b7rgr"
Apr 23 16:37:52.712490 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.712332 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe-certificates\") pod \"keda-operator-ffbb595cb-tsgvj\" (UID: \"6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe\") " pod="openshift-keda/keda-operator-ffbb595cb-tsgvj"
Apr 23 16:37:52.712490 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.712358 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a7cc0950-4b68-4f76-aa8c-12168615ca92-certificates\") pod \"keda-admission-cf49989db-b7rgr\" (UID: \"a7cc0950-4b68-4f76-aa8c-12168615ca92\") " pod="openshift-keda/keda-admission-cf49989db-b7rgr"
Apr 23 16:37:52.712490 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:52.712482 2580 secret.go:281] references non-existent secret key: ca.crt
Apr 23 16:37:52.712642 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:52.712501 2580
projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 23 16:37:52.712642 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:52.712510 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-tsgvj: references non-existent secret key: ca.crt
Apr 23 16:37:52.712642 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:52.712560 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe-certificates podName:6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe nodeName:}" failed. No retries permitted until 2026-04-23 16:37:53.712546256 +0000 UTC m=+183.975332167 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe-certificates") pod "keda-operator-ffbb595cb-tsgvj" (UID: "6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe") : references non-existent secret key: ca.crt
Apr 23 16:37:52.813423 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.813321 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hh67b\" (UniqueName: \"kubernetes.io/projected/a7cc0950-4b68-4f76-aa8c-12168615ca92-kube-api-access-hh67b\") pod \"keda-admission-cf49989db-b7rgr\" (UID: \"a7cc0950-4b68-4f76-aa8c-12168615ca92\") " pod="openshift-keda/keda-admission-cf49989db-b7rgr"
Apr 23 16:37:52.813423 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.813412 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a7cc0950-4b68-4f76-aa8c-12168615ca92-certificates\") pod \"keda-admission-cf49989db-b7rgr\" (UID: \"a7cc0950-4b68-4f76-aa8c-12168615ca92\") " pod="openshift-keda/keda-admission-cf49989db-b7rgr"
Apr 23 16:37:52.813655 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:52.813516 2580 projected.go:264] Couldn't get
secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found
Apr 23 16:37:52.813655 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:52.813541 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-b7rgr: secret "keda-admission-webhooks-certs" not found
Apr 23 16:37:52.813655 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:52.813623 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a7cc0950-4b68-4f76-aa8c-12168615ca92-certificates podName:a7cc0950-4b68-4f76-aa8c-12168615ca92 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:53.313603214 +0000 UTC m=+183.576389125 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a7cc0950-4b68-4f76-aa8c-12168615ca92-certificates") pod "keda-admission-cf49989db-b7rgr" (UID: "a7cc0950-4b68-4f76-aa8c-12168615ca92") : secret "keda-admission-webhooks-certs" not found
Apr 23 16:37:52.824007 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:52.823970 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh67b\" (UniqueName: \"kubernetes.io/projected/a7cc0950-4b68-4f76-aa8c-12168615ca92-kube-api-access-hh67b\") pod \"keda-admission-cf49989db-b7rgr\" (UID: \"a7cc0950-4b68-4f76-aa8c-12168615ca92\") " pod="openshift-keda/keda-admission-cf49989db-b7rgr"
Apr 23 16:37:53.015004 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:53.014970 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/98320b55-882c-49f4-afdc-bab1e27ea968-certificates\") pod \"keda-metrics-apiserver-7c9f485588-88vnq\" (UID: \"98320b55-882c-49f4-afdc-bab1e27ea968\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-88vnq"
Apr 23 16:37:53.015351 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:53.015101 2580 secret.go:281]
references non-existent secret key: tls.crt
Apr 23 16:37:53.015351 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:53.015117 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 23 16:37:53.015351 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:53.015136 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-88vnq: references non-existent secret key: tls.crt
Apr 23 16:37:53.015351 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:53.015197 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/98320b55-882c-49f4-afdc-bab1e27ea968-certificates podName:98320b55-882c-49f4-afdc-bab1e27ea968 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:54.015181451 +0000 UTC m=+184.277967361 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/98320b55-882c-49f4-afdc-bab1e27ea968-certificates") pod "keda-metrics-apiserver-7c9f485588-88vnq" (UID: "98320b55-882c-49f4-afdc-bab1e27ea968") : references non-existent secret key: tls.crt
Apr 23 16:37:53.317683 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:53.317640 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a7cc0950-4b68-4f76-aa8c-12168615ca92-certificates\") pod \"keda-admission-cf49989db-b7rgr\" (UID: \"a7cc0950-4b68-4f76-aa8c-12168615ca92\") " pod="openshift-keda/keda-admission-cf49989db-b7rgr"
Apr 23 16:37:53.320203 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:53.320179 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a7cc0950-4b68-4f76-aa8c-12168615ca92-certificates\") pod \"keda-admission-cf49989db-b7rgr\" (UID: \"a7cc0950-4b68-4f76-aa8c-12168615ca92\") "
pod="openshift-keda/keda-admission-cf49989db-b7rgr"
Apr 23 16:37:53.524283 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:53.524240 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-b7rgr"
Apr 23 16:37:53.658325 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:53.658299 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-b7rgr"]
Apr 23 16:37:53.661329 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:37:53.661292 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7cc0950_4b68_4f76_aa8c_12168615ca92.slice/crio-85d82549467f1345db0b5531084630786646290524ebdfb4a984a27a9d884d73 WatchSource:0}: Error finding container 85d82549467f1345db0b5531084630786646290524ebdfb4a984a27a9d884d73: Status 404 returned error can't find the container with id 85d82549467f1345db0b5531084630786646290524ebdfb4a984a27a9d884d73
Apr 23 16:37:53.721309 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:53.721272 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe-certificates\") pod \"keda-operator-ffbb595cb-tsgvj\" (UID: \"6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe\") " pod="openshift-keda/keda-operator-ffbb595cb-tsgvj"
Apr 23 16:37:53.721486 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:53.721438 2580 secret.go:281] references non-existent secret key: ca.crt
Apr 23 16:37:53.721486 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:53.721457 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 23 16:37:53.721486 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:53.721466 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-tsgvj: references non-existent
secret key: ca.crt Apr 23 16:37:53.721641 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:53.721533 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe-certificates podName:6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe nodeName:}" failed. No retries permitted until 2026-04-23 16:37:55.721517014 +0000 UTC m=+185.984302925 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe-certificates") pod "keda-operator-ffbb595cb-tsgvj" (UID: "6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe") : references non-existent secret key: ca.crt Apr 23 16:37:53.972770 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:53.972732 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-b7rgr" event={"ID":"a7cc0950-4b68-4f76-aa8c-12168615ca92","Type":"ContainerStarted","Data":"85d82549467f1345db0b5531084630786646290524ebdfb4a984a27a9d884d73"} Apr 23 16:37:54.024677 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:54.024638 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/98320b55-882c-49f4-afdc-bab1e27ea968-certificates\") pod \"keda-metrics-apiserver-7c9f485588-88vnq\" (UID: \"98320b55-882c-49f4-afdc-bab1e27ea968\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-88vnq" Apr 23 16:37:54.025061 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:54.024756 2580 secret.go:281] references non-existent secret key: tls.crt Apr 23 16:37:54.025061 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:54.024769 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 23 16:37:54.025061 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:54.024786 2580 projected.go:194] Error preparing data for projected volume certificates for pod 
openshift-keda/keda-metrics-apiserver-7c9f485588-88vnq: references non-existent secret key: tls.crt Apr 23 16:37:54.025061 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:54.024837 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/98320b55-882c-49f4-afdc-bab1e27ea968-certificates podName:98320b55-882c-49f4-afdc-bab1e27ea968 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:56.024821795 +0000 UTC m=+186.287607705 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/98320b55-882c-49f4-afdc-bab1e27ea968-certificates") pod "keda-metrics-apiserver-7c9f485588-88vnq" (UID: "98320b55-882c-49f4-afdc-bab1e27ea968") : references non-existent secret key: tls.crt Apr 23 16:37:55.740100 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:55.740051 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe-certificates\") pod \"keda-operator-ffbb595cb-tsgvj\" (UID: \"6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe\") " pod="openshift-keda/keda-operator-ffbb595cb-tsgvj" Apr 23 16:37:55.740545 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:55.740180 2580 secret.go:281] references non-existent secret key: ca.crt Apr 23 16:37:55.740545 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:55.740199 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 23 16:37:55.740545 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:55.740209 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-tsgvj: references non-existent secret key: ca.crt Apr 23 16:37:55.740545 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:55.740271 2580 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe-certificates podName:6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe nodeName:}" failed. No retries permitted until 2026-04-23 16:37:59.740253902 +0000 UTC m=+190.003039813 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe-certificates") pod "keda-operator-ffbb595cb-tsgvj" (UID: "6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe") : references non-existent secret key: ca.crt Apr 23 16:37:56.042944 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:56.042848 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/98320b55-882c-49f4-afdc-bab1e27ea968-certificates\") pod \"keda-metrics-apiserver-7c9f485588-88vnq\" (UID: \"98320b55-882c-49f4-afdc-bab1e27ea968\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-88vnq" Apr 23 16:37:56.043096 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:56.042967 2580 secret.go:281] references non-existent secret key: tls.crt Apr 23 16:37:56.043096 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:56.043002 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 23 16:37:56.043096 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:56.043024 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-88vnq: references non-existent secret key: tls.crt Apr 23 16:37:56.043096 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:37:56.043095 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/98320b55-882c-49f4-afdc-bab1e27ea968-certificates podName:98320b55-882c-49f4-afdc-bab1e27ea968 nodeName:}" failed. No retries permitted until 2026-04-23 16:38:00.043074376 +0000 UTC m=+190.305860291 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/98320b55-882c-49f4-afdc-bab1e27ea968-certificates") pod "keda-metrics-apiserver-7c9f485588-88vnq" (UID: "98320b55-882c-49f4-afdc-bab1e27ea968") : references non-existent secret key: tls.crt Apr 23 16:37:56.984134 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:56.984095 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-b7rgr" event={"ID":"a7cc0950-4b68-4f76-aa8c-12168615ca92","Type":"ContainerStarted","Data":"789aa825bc66a4951aa7eb025f790980852e3fbdcaaf6d9c277df8d5e63c92e4"} Apr 23 16:37:56.984644 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:56.984256 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-b7rgr" Apr 23 16:37:57.014415 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:57.012308 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-b7rgr" podStartSLOduration=2.437600658 podStartE2EDuration="5.01228712s" podCreationTimestamp="2026-04-23 16:37:52 +0000 UTC" firstStartedPulling="2026-04-23 16:37:53.662789992 +0000 UTC m=+183.925575903" lastFinishedPulling="2026-04-23 16:37:56.237476441 +0000 UTC m=+186.500262365" observedRunningTime="2026-04-23 16:37:57.011099523 +0000 UTC m=+187.273885457" watchObservedRunningTime="2026-04-23 16:37:57.01228712 +0000 UTC m=+187.275073054" Apr 23 16:37:59.779140 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:59.779098 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe-certificates\") pod \"keda-operator-ffbb595cb-tsgvj\" (UID: \"6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe\") " pod="openshift-keda/keda-operator-ffbb595cb-tsgvj" Apr 23 16:37:59.781753 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:59.781723 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe-certificates\") pod \"keda-operator-ffbb595cb-tsgvj\" (UID: \"6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe\") " pod="openshift-keda/keda-operator-ffbb595cb-tsgvj" Apr 23 16:37:59.792592 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:59.792553 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-tsgvj" Apr 23 16:37:59.922086 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:59.922034 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-tsgvj"] Apr 23 16:37:59.924459 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:37:59.924414 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cfbe6a0_086e_4c9c_a335_1ae0ada65cbe.slice/crio-7a24fe5381ddeca04d235e4feb51dfa985ee515dc5531a68ea2f6011326ceffa WatchSource:0}: Error finding container 7a24fe5381ddeca04d235e4feb51dfa985ee515dc5531a68ea2f6011326ceffa: Status 404 returned error can't find the container with id 7a24fe5381ddeca04d235e4feb51dfa985ee515dc5531a68ea2f6011326ceffa Apr 23 16:37:59.994680 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:37:59.994635 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-tsgvj" event={"ID":"6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe","Type":"ContainerStarted","Data":"7a24fe5381ddeca04d235e4feb51dfa985ee515dc5531a68ea2f6011326ceffa"} Apr 23 16:38:00.082479 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:00.082399 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/98320b55-882c-49f4-afdc-bab1e27ea968-certificates\") pod \"keda-metrics-apiserver-7c9f485588-88vnq\" (UID: \"98320b55-882c-49f4-afdc-bab1e27ea968\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-88vnq" Apr 23 16:38:00.085087 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:00.085062 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/98320b55-882c-49f4-afdc-bab1e27ea968-certificates\") pod \"keda-metrics-apiserver-7c9f485588-88vnq\" (UID: \"98320b55-882c-49f4-afdc-bab1e27ea968\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-88vnq" Apr 23 16:38:00.368876 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:00.368777 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-88vnq" Apr 23 16:38:00.503597 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:00.503553 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-88vnq"] Apr 23 16:38:00.505481 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:38:00.505453 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98320b55_882c_49f4_afdc_bab1e27ea968.slice/crio-b71d1a36b94abcfd5b3f7f5d12491c273f91226f3f7168c489e11b353564e296 WatchSource:0}: Error finding container b71d1a36b94abcfd5b3f7f5d12491c273f91226f3f7168c489e11b353564e296: Status 404 returned error can't find the container with id b71d1a36b94abcfd5b3f7f5d12491c273f91226f3f7168c489e11b353564e296 Apr 23 16:38:01.000106 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:01.000067 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-88vnq" event={"ID":"98320b55-882c-49f4-afdc-bab1e27ea968","Type":"ContainerStarted","Data":"b71d1a36b94abcfd5b3f7f5d12491c273f91226f3f7168c489e11b353564e296"} Apr 23 16:38:05.014994 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:05.014958 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-tsgvj" 
event={"ID":"6cfbe6a0-086e-4c9c-a335-1ae0ada65cbe","Type":"ContainerStarted","Data":"e3c363c66eb55f0741597fb381e92b6ddfe5324f4a2cf5953ff4d52a3e72b932"} Apr 23 16:38:05.015441 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:05.015180 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-tsgvj" Apr 23 16:38:05.016419 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:05.016395 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-88vnq" event={"ID":"98320b55-882c-49f4-afdc-bab1e27ea968","Type":"ContainerStarted","Data":"002950f6b90f221c56c6a391f2aeda23618327bc9d04e65506fa47c7453365e2"} Apr 23 16:38:05.016554 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:05.016538 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-88vnq" Apr 23 16:38:05.036514 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:05.036462 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-tsgvj" podStartSLOduration=9.951608571 podStartE2EDuration="14.036448409s" podCreationTimestamp="2026-04-23 16:37:51 +0000 UTC" firstStartedPulling="2026-04-23 16:37:59.925713746 +0000 UTC m=+190.188499657" lastFinishedPulling="2026-04-23 16:38:04.010553585 +0000 UTC m=+194.273339495" observedRunningTime="2026-04-23 16:38:05.034136091 +0000 UTC m=+195.296922025" watchObservedRunningTime="2026-04-23 16:38:05.036448409 +0000 UTC m=+195.299234342" Apr 23 16:38:05.067633 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:05.067554 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-88vnq" podStartSLOduration=9.562140778 podStartE2EDuration="13.067533323s" podCreationTimestamp="2026-04-23 16:37:52 +0000 UTC" firstStartedPulling="2026-04-23 16:38:00.506787718 +0000 UTC m=+190.769573628" 
lastFinishedPulling="2026-04-23 16:38:04.012180257 +0000 UTC m=+194.274966173" observedRunningTime="2026-04-23 16:38:05.066525133 +0000 UTC m=+195.329311066" watchObservedRunningTime="2026-04-23 16:38:05.067533323 +0000 UTC m=+195.330319257" Apr 23 16:38:12.970847 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:12.970814 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vkrj4" Apr 23 16:38:16.024624 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:16.024566 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-88vnq" Apr 23 16:38:17.990131 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:17.990100 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-b7rgr" Apr 23 16:38:26.022323 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:26.022285 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-tsgvj" Apr 23 16:38:59.626436 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.626400 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-cvm6t"] Apr 23 16:38:59.628818 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.628800 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-5b898d7b9d-cvm6t" Apr 23 16:38:59.632332 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.632306 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-trpjj\"" Apr 23 16:38:59.632490 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.632365 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 23 16:38:59.632490 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.632413 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 23 16:38:59.632490 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.632451 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 23 16:38:59.641555 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.641533 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-6b94ff949c-6f9zw"] Apr 23 16:38:59.643749 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.643730 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-6b94ff949c-6f9zw" Apr 23 16:38:59.644147 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.644129 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-cvm6t"] Apr 23 16:38:59.646369 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.646345 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-sv97f\"" Apr 23 16:38:59.646513 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.646495 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 23 16:38:59.658838 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.658809 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6b94ff949c-6f9zw"] Apr 23 16:38:59.667982 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.667955 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-dtdh7"] Apr 23 16:38:59.669933 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.669905 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7vbd\" (UniqueName: \"kubernetes.io/projected/d305ef47-e072-458a-b817-c133935e2157-kube-api-access-t7vbd\") pod \"llmisvc-controller-manager-6b94ff949c-6f9zw\" (UID: \"d305ef47-e072-458a-b817-c133935e2157\") " pod="kserve/llmisvc-controller-manager-6b94ff949c-6f9zw" Apr 23 16:38:59.670067 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.669942 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zngl\" (UniqueName: \"kubernetes.io/projected/ec6c55dc-57f9-42dc-bd87-3cfa96e05185-kube-api-access-4zngl\") pod \"kserve-controller-manager-5b898d7b9d-cvm6t\" (UID: \"ec6c55dc-57f9-42dc-bd87-3cfa96e05185\") " pod="kserve/kserve-controller-manager-5b898d7b9d-cvm6t" Apr 23 
16:38:59.670067 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.669970 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d305ef47-e072-458a-b817-c133935e2157-cert\") pod \"llmisvc-controller-manager-6b94ff949c-6f9zw\" (UID: \"d305ef47-e072-458a-b817-c133935e2157\") " pod="kserve/llmisvc-controller-manager-6b94ff949c-6f9zw" Apr 23 16:38:59.670067 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.670011 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec6c55dc-57f9-42dc-bd87-3cfa96e05185-cert\") pod \"kserve-controller-manager-5b898d7b9d-cvm6t\" (UID: \"ec6c55dc-57f9-42dc-bd87-3cfa96e05185\") " pod="kserve/kserve-controller-manager-5b898d7b9d-cvm6t" Apr 23 16:38:59.670510 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.670489 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-dtdh7" Apr 23 16:38:59.673146 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.673123 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 23 16:38:59.674364 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.674341 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-ckzvv\"" Apr 23 16:38:59.685287 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.685261 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-dtdh7"] Apr 23 16:38:59.770871 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.770829 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/8679184f-be15-4071-9ec1-92da472ff0d4-data\") pod \"seaweedfs-86cc847c5c-dtdh7\" (UID: \"8679184f-be15-4071-9ec1-92da472ff0d4\") " 
pod="kserve/seaweedfs-86cc847c5c-dtdh7" Apr 23 16:38:59.771079 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.770893 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7vbd\" (UniqueName: \"kubernetes.io/projected/d305ef47-e072-458a-b817-c133935e2157-kube-api-access-t7vbd\") pod \"llmisvc-controller-manager-6b94ff949c-6f9zw\" (UID: \"d305ef47-e072-458a-b817-c133935e2157\") " pod="kserve/llmisvc-controller-manager-6b94ff949c-6f9zw" Apr 23 16:38:59.771079 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.770930 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zngl\" (UniqueName: \"kubernetes.io/projected/ec6c55dc-57f9-42dc-bd87-3cfa96e05185-kube-api-access-4zngl\") pod \"kserve-controller-manager-5b898d7b9d-cvm6t\" (UID: \"ec6c55dc-57f9-42dc-bd87-3cfa96e05185\") " pod="kserve/kserve-controller-manager-5b898d7b9d-cvm6t" Apr 23 16:38:59.771079 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.770957 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d305ef47-e072-458a-b817-c133935e2157-cert\") pod \"llmisvc-controller-manager-6b94ff949c-6f9zw\" (UID: \"d305ef47-e072-458a-b817-c133935e2157\") " pod="kserve/llmisvc-controller-manager-6b94ff949c-6f9zw" Apr 23 16:38:59.771079 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.770979 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec6c55dc-57f9-42dc-bd87-3cfa96e05185-cert\") pod \"kserve-controller-manager-5b898d7b9d-cvm6t\" (UID: \"ec6c55dc-57f9-42dc-bd87-3cfa96e05185\") " pod="kserve/kserve-controller-manager-5b898d7b9d-cvm6t" Apr 23 16:38:59.771079 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.771013 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvgbp\" (UniqueName: 
\"kubernetes.io/projected/8679184f-be15-4071-9ec1-92da472ff0d4-kube-api-access-mvgbp\") pod \"seaweedfs-86cc847c5c-dtdh7\" (UID: \"8679184f-be15-4071-9ec1-92da472ff0d4\") " pod="kserve/seaweedfs-86cc847c5c-dtdh7" Apr 23 16:38:59.773609 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.773560 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec6c55dc-57f9-42dc-bd87-3cfa96e05185-cert\") pod \"kserve-controller-manager-5b898d7b9d-cvm6t\" (UID: \"ec6c55dc-57f9-42dc-bd87-3cfa96e05185\") " pod="kserve/kserve-controller-manager-5b898d7b9d-cvm6t" Apr 23 16:38:59.773730 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.773660 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d305ef47-e072-458a-b817-c133935e2157-cert\") pod \"llmisvc-controller-manager-6b94ff949c-6f9zw\" (UID: \"d305ef47-e072-458a-b817-c133935e2157\") " pod="kserve/llmisvc-controller-manager-6b94ff949c-6f9zw" Apr 23 16:38:59.787775 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.787746 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7vbd\" (UniqueName: \"kubernetes.io/projected/d305ef47-e072-458a-b817-c133935e2157-kube-api-access-t7vbd\") pod \"llmisvc-controller-manager-6b94ff949c-6f9zw\" (UID: \"d305ef47-e072-458a-b817-c133935e2157\") " pod="kserve/llmisvc-controller-manager-6b94ff949c-6f9zw" Apr 23 16:38:59.787877 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.787817 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zngl\" (UniqueName: \"kubernetes.io/projected/ec6c55dc-57f9-42dc-bd87-3cfa96e05185-kube-api-access-4zngl\") pod \"kserve-controller-manager-5b898d7b9d-cvm6t\" (UID: \"ec6c55dc-57f9-42dc-bd87-3cfa96e05185\") " pod="kserve/kserve-controller-manager-5b898d7b9d-cvm6t" Apr 23 16:38:59.872258 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.872221 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mvgbp\" (UniqueName: \"kubernetes.io/projected/8679184f-be15-4071-9ec1-92da472ff0d4-kube-api-access-mvgbp\") pod \"seaweedfs-86cc847c5c-dtdh7\" (UID: \"8679184f-be15-4071-9ec1-92da472ff0d4\") " pod="kserve/seaweedfs-86cc847c5c-dtdh7" Apr 23 16:38:59.872468 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.872301 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/8679184f-be15-4071-9ec1-92da472ff0d4-data\") pod \"seaweedfs-86cc847c5c-dtdh7\" (UID: \"8679184f-be15-4071-9ec1-92da472ff0d4\") " pod="kserve/seaweedfs-86cc847c5c-dtdh7" Apr 23 16:38:59.872678 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.872661 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/8679184f-be15-4071-9ec1-92da472ff0d4-data\") pod \"seaweedfs-86cc847c5c-dtdh7\" (UID: \"8679184f-be15-4071-9ec1-92da472ff0d4\") " pod="kserve/seaweedfs-86cc847c5c-dtdh7" Apr 23 16:38:59.881360 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.881294 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvgbp\" (UniqueName: \"kubernetes.io/projected/8679184f-be15-4071-9ec1-92da472ff0d4-kube-api-access-mvgbp\") pod \"seaweedfs-86cc847c5c-dtdh7\" (UID: \"8679184f-be15-4071-9ec1-92da472ff0d4\") " pod="kserve/seaweedfs-86cc847c5c-dtdh7" Apr 23 16:38:59.938865 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.938835 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-5b898d7b9d-cvm6t" Apr 23 16:38:59.953699 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.953666 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-6b94ff949c-6f9zw" Apr 23 16:38:59.984803 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:38:59.984727 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-dtdh7" Apr 23 16:39:00.103182 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:00.103134 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-cvm6t"] Apr 23 16:39:00.105718 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:39:00.105653 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec6c55dc_57f9_42dc_bd87_3cfa96e05185.slice/crio-3e1b5ab173cba132c373e8d34c5b71cbe0eef69408af0188137ed7fba49c91ed WatchSource:0}: Error finding container 3e1b5ab173cba132c373e8d34c5b71cbe0eef69408af0188137ed7fba49c91ed: Status 404 returned error can't find the container with id 3e1b5ab173cba132c373e8d34c5b71cbe0eef69408af0188137ed7fba49c91ed Apr 23 16:39:00.128753 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:00.128726 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6b94ff949c-6f9zw"] Apr 23 16:39:00.130406 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:39:00.130373 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd305ef47_e072_458a_b817_c133935e2157.slice/crio-1f1a0c46973c9dc4ec3baf0d80988358b02b6c2a1978fadac4526a3e8083200b WatchSource:0}: Error finding container 1f1a0c46973c9dc4ec3baf0d80988358b02b6c2a1978fadac4526a3e8083200b: Status 404 returned error can't find the container with id 1f1a0c46973c9dc4ec3baf0d80988358b02b6c2a1978fadac4526a3e8083200b Apr 23 16:39:00.148195 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:00.148168 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-dtdh7"] Apr 23 16:39:00.150896 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:39:00.150873 
2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8679184f_be15_4071_9ec1_92da472ff0d4.slice/crio-41b03f8fb3fe1c9f4bda8b479c7340958f271d599e4b1f5659006e3a7635a580 WatchSource:0}: Error finding container 41b03f8fb3fe1c9f4bda8b479c7340958f271d599e4b1f5659006e3a7635a580: Status 404 returned error can't find the container with id 41b03f8fb3fe1c9f4bda8b479c7340958f271d599e4b1f5659006e3a7635a580
Apr 23 16:39:00.190254 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:00.190222 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6b94ff949c-6f9zw" event={"ID":"d305ef47-e072-458a-b817-c133935e2157","Type":"ContainerStarted","Data":"1f1a0c46973c9dc4ec3baf0d80988358b02b6c2a1978fadac4526a3e8083200b"}
Apr 23 16:39:00.191250 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:00.191227 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-dtdh7" event={"ID":"8679184f-be15-4071-9ec1-92da472ff0d4","Type":"ContainerStarted","Data":"41b03f8fb3fe1c9f4bda8b479c7340958f271d599e4b1f5659006e3a7635a580"}
Apr 23 16:39:00.192233 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:00.192213 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-5b898d7b9d-cvm6t" event={"ID":"ec6c55dc-57f9-42dc-bd87-3cfa96e05185","Type":"ContainerStarted","Data":"3e1b5ab173cba132c373e8d34c5b71cbe0eef69408af0188137ed7fba49c91ed"}
Apr 23 16:39:05.212894 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:05.212853 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6b94ff949c-6f9zw" event={"ID":"d305ef47-e072-458a-b817-c133935e2157","Type":"ContainerStarted","Data":"56a2347e39364e6f0a21751edabecbf64ad8615d30c3a71bb7d7e6032c06d476"}
Apr 23 16:39:05.213433 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:05.212999 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-6b94ff949c-6f9zw"
Apr 23 16:39:05.214235 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:05.214211 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-dtdh7" event={"ID":"8679184f-be15-4071-9ec1-92da472ff0d4","Type":"ContainerStarted","Data":"d242a4cf153e0fe5e963baf58599560fe68edbade3b098425d9a777c91f1d426"}
Apr 23 16:39:05.214363 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:05.214336 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-dtdh7"
Apr 23 16:39:05.215395 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:05.215374 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-5b898d7b9d-cvm6t" event={"ID":"ec6c55dc-57f9-42dc-bd87-3cfa96e05185","Type":"ContainerStarted","Data":"d0802d119ab99196246937fa18e86f37abe206325f5484086c8d0590653ba5c8"}
Apr 23 16:39:05.215479 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:05.215473 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-5b898d7b9d-cvm6t"
Apr 23 16:39:05.231236 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:05.231190 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-6b94ff949c-6f9zw" podStartSLOduration=1.687645442 podStartE2EDuration="6.231175847s" podCreationTimestamp="2026-04-23 16:38:59 +0000 UTC" firstStartedPulling="2026-04-23 16:39:00.131791786 +0000 UTC m=+250.394577697" lastFinishedPulling="2026-04-23 16:39:04.67532219 +0000 UTC m=+254.938108102" observedRunningTime="2026-04-23 16:39:05.229369883 +0000 UTC m=+255.492155834" watchObservedRunningTime="2026-04-23 16:39:05.231175847 +0000 UTC m=+255.493961777"
Apr 23 16:39:05.252964 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:05.252914 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-5b898d7b9d-cvm6t" podStartSLOduration=1.685626023 podStartE2EDuration="6.252900774s" podCreationTimestamp="2026-04-23 16:38:59 +0000 UTC" firstStartedPulling="2026-04-23 16:39:00.107450079 +0000 UTC m=+250.370235990" lastFinishedPulling="2026-04-23 16:39:04.674724816 +0000 UTC m=+254.937510741" observedRunningTime="2026-04-23 16:39:05.250215172 +0000 UTC m=+255.513001104" watchObservedRunningTime="2026-04-23 16:39:05.252900774 +0000 UTC m=+255.515686707"
Apr 23 16:39:05.283769 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:05.283708 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-dtdh7" podStartSLOduration=1.700853481 podStartE2EDuration="6.283686959s" podCreationTimestamp="2026-04-23 16:38:59 +0000 UTC" firstStartedPulling="2026-04-23 16:39:00.152252717 +0000 UTC m=+250.415038627" lastFinishedPulling="2026-04-23 16:39:04.73508619 +0000 UTC m=+254.997872105" observedRunningTime="2026-04-23 16:39:05.28240666 +0000 UTC m=+255.545192592" watchObservedRunningTime="2026-04-23 16:39:05.283686959 +0000 UTC m=+255.546472895"
Apr 23 16:39:11.220907 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:11.220821 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-dtdh7"
Apr 23 16:39:36.220993 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:36.220960 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-6b94ff949c-6f9zw"
Apr 23 16:39:36.224155 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:36.224125 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-5b898d7b9d-cvm6t"
Apr 23 16:39:37.457289 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:37.457256 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-cvm6t"]
Apr 23 16:39:37.457709 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:37.457462 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-5b898d7b9d-cvm6t" podUID="ec6c55dc-57f9-42dc-bd87-3cfa96e05185" containerName="manager" containerID="cri-o://d0802d119ab99196246937fa18e86f37abe206325f5484086c8d0590653ba5c8" gracePeriod=10
Apr 23 16:39:37.479827 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:37.479793 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-pjflc"]
Apr 23 16:39:37.512549 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:37.512517 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-pjflc"]
Apr 23 16:39:37.512705 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:37.512690 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-5b898d7b9d-pjflc"
Apr 23 16:39:37.558481 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:37.558449 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f57cc79-4d5f-4f58-89e9-b66bce0b28fe-cert\") pod \"kserve-controller-manager-5b898d7b9d-pjflc\" (UID: \"4f57cc79-4d5f-4f58-89e9-b66bce0b28fe\") " pod="kserve/kserve-controller-manager-5b898d7b9d-pjflc"
Apr 23 16:39:37.558691 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:37.558507 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8wxh\" (UniqueName: \"kubernetes.io/projected/4f57cc79-4d5f-4f58-89e9-b66bce0b28fe-kube-api-access-f8wxh\") pod \"kserve-controller-manager-5b898d7b9d-pjflc\" (UID: \"4f57cc79-4d5f-4f58-89e9-b66bce0b28fe\") " pod="kserve/kserve-controller-manager-5b898d7b9d-pjflc"
Apr 23 16:39:37.659205 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:37.659163 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f57cc79-4d5f-4f58-89e9-b66bce0b28fe-cert\") pod \"kserve-controller-manager-5b898d7b9d-pjflc\" (UID: \"4f57cc79-4d5f-4f58-89e9-b66bce0b28fe\") " pod="kserve/kserve-controller-manager-5b898d7b9d-pjflc"
Apr 23 16:39:37.659370 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:37.659261 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8wxh\" (UniqueName: \"kubernetes.io/projected/4f57cc79-4d5f-4f58-89e9-b66bce0b28fe-kube-api-access-f8wxh\") pod \"kserve-controller-manager-5b898d7b9d-pjflc\" (UID: \"4f57cc79-4d5f-4f58-89e9-b66bce0b28fe\") " pod="kserve/kserve-controller-manager-5b898d7b9d-pjflc"
Apr 23 16:39:37.662231 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:37.662195 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f57cc79-4d5f-4f58-89e9-b66bce0b28fe-cert\") pod \"kserve-controller-manager-5b898d7b9d-pjflc\" (UID: \"4f57cc79-4d5f-4f58-89e9-b66bce0b28fe\") " pod="kserve/kserve-controller-manager-5b898d7b9d-pjflc"
Apr 23 16:39:37.668470 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:37.668441 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8wxh\" (UniqueName: \"kubernetes.io/projected/4f57cc79-4d5f-4f58-89e9-b66bce0b28fe-kube-api-access-f8wxh\") pod \"kserve-controller-manager-5b898d7b9d-pjflc\" (UID: \"4f57cc79-4d5f-4f58-89e9-b66bce0b28fe\") " pod="kserve/kserve-controller-manager-5b898d7b9d-pjflc"
Apr 23 16:39:37.719204 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:37.719177 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-5b898d7b9d-cvm6t"
Apr 23 16:39:37.760377 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:37.760340 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zngl\" (UniqueName: \"kubernetes.io/projected/ec6c55dc-57f9-42dc-bd87-3cfa96e05185-kube-api-access-4zngl\") pod \"ec6c55dc-57f9-42dc-bd87-3cfa96e05185\" (UID: \"ec6c55dc-57f9-42dc-bd87-3cfa96e05185\") "
Apr 23 16:39:37.760541 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:37.760410 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec6c55dc-57f9-42dc-bd87-3cfa96e05185-cert\") pod \"ec6c55dc-57f9-42dc-bd87-3cfa96e05185\" (UID: \"ec6c55dc-57f9-42dc-bd87-3cfa96e05185\") "
Apr 23 16:39:37.762811 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:37.762779 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec6c55dc-57f9-42dc-bd87-3cfa96e05185-cert" (OuterVolumeSpecName: "cert") pod "ec6c55dc-57f9-42dc-bd87-3cfa96e05185" (UID: "ec6c55dc-57f9-42dc-bd87-3cfa96e05185"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:39:37.762952 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:37.762879 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec6c55dc-57f9-42dc-bd87-3cfa96e05185-kube-api-access-4zngl" (OuterVolumeSpecName: "kube-api-access-4zngl") pod "ec6c55dc-57f9-42dc-bd87-3cfa96e05185" (UID: "ec6c55dc-57f9-42dc-bd87-3cfa96e05185"). InnerVolumeSpecName "kube-api-access-4zngl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:39:37.862004 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:37.861963 2580 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec6c55dc-57f9-42dc-bd87-3cfa96e05185-cert\") on node \"ip-10-0-142-4.ec2.internal\" DevicePath \"\""
Apr 23 16:39:37.862004 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:37.861997 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4zngl\" (UniqueName: \"kubernetes.io/projected/ec6c55dc-57f9-42dc-bd87-3cfa96e05185-kube-api-access-4zngl\") on node \"ip-10-0-142-4.ec2.internal\" DevicePath \"\""
Apr 23 16:39:37.866870 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:37.866834 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-5b898d7b9d-pjflc"
Apr 23 16:39:37.998714 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:37.998686 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-pjflc"]
Apr 23 16:39:38.001095 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:39:38.001063 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f57cc79_4d5f_4f58_89e9_b66bce0b28fe.slice/crio-1cce10ec182d0c93f89955f0b34c67c14db405d21650e42354f2cc888140a39e WatchSource:0}: Error finding container 1cce10ec182d0c93f89955f0b34c67c14db405d21650e42354f2cc888140a39e: Status 404 returned error can't find the container with id 1cce10ec182d0c93f89955f0b34c67c14db405d21650e42354f2cc888140a39e
Apr 23 16:39:38.333917 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:38.333875 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-5b898d7b9d-pjflc" event={"ID":"4f57cc79-4d5f-4f58-89e9-b66bce0b28fe","Type":"ContainerStarted","Data":"1cce10ec182d0c93f89955f0b34c67c14db405d21650e42354f2cc888140a39e"}
Apr 23 16:39:38.334959 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:38.334931 2580 generic.go:358] "Generic (PLEG): container finished" podID="ec6c55dc-57f9-42dc-bd87-3cfa96e05185" containerID="d0802d119ab99196246937fa18e86f37abe206325f5484086c8d0590653ba5c8" exitCode=0
Apr 23 16:39:38.335096 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:38.334998 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-5b898d7b9d-cvm6t"
Apr 23 16:39:38.335096 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:38.335017 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-5b898d7b9d-cvm6t" event={"ID":"ec6c55dc-57f9-42dc-bd87-3cfa96e05185","Type":"ContainerDied","Data":"d0802d119ab99196246937fa18e86f37abe206325f5484086c8d0590653ba5c8"}
Apr 23 16:39:38.335096 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:38.335045 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-5b898d7b9d-cvm6t" event={"ID":"ec6c55dc-57f9-42dc-bd87-3cfa96e05185","Type":"ContainerDied","Data":"3e1b5ab173cba132c373e8d34c5b71cbe0eef69408af0188137ed7fba49c91ed"}
Apr 23 16:39:38.335096 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:38.335059 2580 scope.go:117] "RemoveContainer" containerID="d0802d119ab99196246937fa18e86f37abe206325f5484086c8d0590653ba5c8"
Apr 23 16:39:38.343370 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:38.343346 2580 scope.go:117] "RemoveContainer" containerID="d0802d119ab99196246937fa18e86f37abe206325f5484086c8d0590653ba5c8"
Apr 23 16:39:38.343687 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:39:38.343670 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0802d119ab99196246937fa18e86f37abe206325f5484086c8d0590653ba5c8\": container with ID starting with d0802d119ab99196246937fa18e86f37abe206325f5484086c8d0590653ba5c8 not found: ID does not exist" containerID="d0802d119ab99196246937fa18e86f37abe206325f5484086c8d0590653ba5c8"
Apr 23 16:39:38.343736 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:38.343696 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0802d119ab99196246937fa18e86f37abe206325f5484086c8d0590653ba5c8"} err="failed to get container status \"d0802d119ab99196246937fa18e86f37abe206325f5484086c8d0590653ba5c8\": rpc error: code = NotFound desc = could not find container \"d0802d119ab99196246937fa18e86f37abe206325f5484086c8d0590653ba5c8\": container with ID starting with d0802d119ab99196246937fa18e86f37abe206325f5484086c8d0590653ba5c8 not found: ID does not exist"
Apr 23 16:39:38.352904 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:38.352872 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-cvm6t"]
Apr 23 16:39:38.356960 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:38.356933 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-cvm6t"]
Apr 23 16:39:39.340173 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:39.340129 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-5b898d7b9d-pjflc" event={"ID":"4f57cc79-4d5f-4f58-89e9-b66bce0b28fe","Type":"ContainerStarted","Data":"f3428aeaa9245bbf10bfc21569f85956541ded82c4577888089c8835f372fa50"}
Apr 23 16:39:39.340753 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:39.340209 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-5b898d7b9d-pjflc"
Apr 23 16:39:39.357295 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:39.357239 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-5b898d7b9d-pjflc" podStartSLOduration=1.853333579 podStartE2EDuration="2.357221744s" podCreationTimestamp="2026-04-23 16:39:37 +0000 UTC" firstStartedPulling="2026-04-23 16:39:38.00229873 +0000 UTC m=+288.265084645" lastFinishedPulling="2026-04-23 16:39:38.506186895 +0000 UTC m=+288.768972810" observedRunningTime="2026-04-23 16:39:39.35661055 +0000 UTC m=+289.619396484" watchObservedRunningTime="2026-04-23 16:39:39.357221744 +0000 UTC m=+289.620007676"
Apr 23 16:39:40.285613 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:40.282436 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec6c55dc-57f9-42dc-bd87-3cfa96e05185" path="/var/lib/kubelet/pods/ec6c55dc-57f9-42dc-bd87-3cfa96e05185/volumes"
Apr 23 16:39:50.156252 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:50.156222 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xvn7t_84c993c8-4dd2-40dc-b624-68a9f75a89cb/ovn-acl-logging/0.log"
Apr 23 16:39:50.156845 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:50.156536 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xvn7t_84c993c8-4dd2-40dc-b624-68a9f75a89cb/ovn-acl-logging/0.log"
Apr 23 16:39:50.164316 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:39:50.164291 2580 kubelet.go:1628] "Image garbage collection succeeded"
Apr 23 16:40:10.350172 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:10.350140 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-5b898d7b9d-pjflc"
Apr 23 16:40:11.208622 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:11.208567 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-bn5gl"]
Apr 23 16:40:11.208928 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:11.208913 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec6c55dc-57f9-42dc-bd87-3cfa96e05185" containerName="manager"
Apr 23 16:40:11.208971 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:11.208930 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec6c55dc-57f9-42dc-bd87-3cfa96e05185" containerName="manager"
Apr 23 16:40:11.209015 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:11.208989 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec6c55dc-57f9-42dc-bd87-3cfa96e05185" containerName="manager"
Apr 23 16:40:11.211999 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:11.211967 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-bn5gl"
Apr 23 16:40:11.214933 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:11.214901 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-2vxjm\""
Apr 23 16:40:11.215127 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:11.215113 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 23 16:40:11.220635 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:11.220610 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-bn5gl"]
Apr 23 16:40:11.223379 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:11.223348 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-ps4x9"]
Apr 23 16:40:11.226967 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:11.226936 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-ps4x9"
Apr 23 16:40:11.230229 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:11.230197 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 23 16:40:11.230391 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:11.230258 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-bhw4l\""
Apr 23 16:40:11.239514 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:11.239477 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-ps4x9"]
Apr 23 16:40:11.349774 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:11.349737 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a034fa4d-2253-42ed-81a9-141faae62b3b-tls-certs\") pod \"model-serving-api-86f7b4b499-bn5gl\" (UID: \"a034fa4d-2253-42ed-81a9-141faae62b3b\") " pod="kserve/model-serving-api-86f7b4b499-bn5gl"
Apr 23 16:40:11.349961 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:11.349801 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5z2q\" (UniqueName: \"kubernetes.io/projected/9ad74a80-cfe8-463e-a0df-2b796c789b70-kube-api-access-j5z2q\") pod \"odh-model-controller-696fc77849-ps4x9\" (UID: \"9ad74a80-cfe8-463e-a0df-2b796c789b70\") " pod="kserve/odh-model-controller-696fc77849-ps4x9"
Apr 23 16:40:11.349961 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:11.349866 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh6mg\" (UniqueName: \"kubernetes.io/projected/a034fa4d-2253-42ed-81a9-141faae62b3b-kube-api-access-lh6mg\") pod \"model-serving-api-86f7b4b499-bn5gl\" (UID: \"a034fa4d-2253-42ed-81a9-141faae62b3b\") " pod="kserve/model-serving-api-86f7b4b499-bn5gl"
Apr 23 16:40:11.349961 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:11.349910 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ad74a80-cfe8-463e-a0df-2b796c789b70-cert\") pod \"odh-model-controller-696fc77849-ps4x9\" (UID: \"9ad74a80-cfe8-463e-a0df-2b796c789b70\") " pod="kserve/odh-model-controller-696fc77849-ps4x9"
Apr 23 16:40:11.450595 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:11.450550 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a034fa4d-2253-42ed-81a9-141faae62b3b-tls-certs\") pod \"model-serving-api-86f7b4b499-bn5gl\" (UID: \"a034fa4d-2253-42ed-81a9-141faae62b3b\") " pod="kserve/model-serving-api-86f7b4b499-bn5gl"
Apr 23 16:40:11.450966 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:11.450660 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5z2q\" (UniqueName: \"kubernetes.io/projected/9ad74a80-cfe8-463e-a0df-2b796c789b70-kube-api-access-j5z2q\") pod \"odh-model-controller-696fc77849-ps4x9\" (UID: \"9ad74a80-cfe8-463e-a0df-2b796c789b70\") " pod="kserve/odh-model-controller-696fc77849-ps4x9"
Apr 23 16:40:11.450966 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:11.450681 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lh6mg\" (UniqueName: \"kubernetes.io/projected/a034fa4d-2253-42ed-81a9-141faae62b3b-kube-api-access-lh6mg\") pod \"model-serving-api-86f7b4b499-bn5gl\" (UID: \"a034fa4d-2253-42ed-81a9-141faae62b3b\") " pod="kserve/model-serving-api-86f7b4b499-bn5gl"
Apr 23 16:40:11.450966 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:11.450705 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ad74a80-cfe8-463e-a0df-2b796c789b70-cert\") pod \"odh-model-controller-696fc77849-ps4x9\" (UID: \"9ad74a80-cfe8-463e-a0df-2b796c789b70\") " pod="kserve/odh-model-controller-696fc77849-ps4x9"
Apr 23 16:40:11.450966 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:40:11.450710 2580 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found
Apr 23 16:40:11.450966 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:40:11.450802 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a034fa4d-2253-42ed-81a9-141faae62b3b-tls-certs podName:a034fa4d-2253-42ed-81a9-141faae62b3b nodeName:}" failed. No retries permitted until 2026-04-23 16:40:11.950779332 +0000 UTC m=+322.213565242 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/a034fa4d-2253-42ed-81a9-141faae62b3b-tls-certs") pod "model-serving-api-86f7b4b499-bn5gl" (UID: "a034fa4d-2253-42ed-81a9-141faae62b3b") : secret "model-serving-api-tls" not found
Apr 23 16:40:11.450966 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:40:11.450801 2580 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 23 16:40:11.450966 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:40:11.450850 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ad74a80-cfe8-463e-a0df-2b796c789b70-cert podName:9ad74a80-cfe8-463e-a0df-2b796c789b70 nodeName:}" failed. No retries permitted until 2026-04-23 16:40:11.950834342 +0000 UTC m=+322.213620255 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ad74a80-cfe8-463e-a0df-2b796c789b70-cert") pod "odh-model-controller-696fc77849-ps4x9" (UID: "9ad74a80-cfe8-463e-a0df-2b796c789b70") : secret "odh-model-controller-webhook-cert" not found
Apr 23 16:40:11.460322 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:11.460255 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh6mg\" (UniqueName: \"kubernetes.io/projected/a034fa4d-2253-42ed-81a9-141faae62b3b-kube-api-access-lh6mg\") pod \"model-serving-api-86f7b4b499-bn5gl\" (UID: \"a034fa4d-2253-42ed-81a9-141faae62b3b\") " pod="kserve/model-serving-api-86f7b4b499-bn5gl"
Apr 23 16:40:11.460443 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:11.460353 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5z2q\" (UniqueName: \"kubernetes.io/projected/9ad74a80-cfe8-463e-a0df-2b796c789b70-kube-api-access-j5z2q\") pod \"odh-model-controller-696fc77849-ps4x9\" (UID: \"9ad74a80-cfe8-463e-a0df-2b796c789b70\") " pod="kserve/odh-model-controller-696fc77849-ps4x9"
Apr 23 16:40:11.955849 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:11.955814 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a034fa4d-2253-42ed-81a9-141faae62b3b-tls-certs\") pod \"model-serving-api-86f7b4b499-bn5gl\" (UID: \"a034fa4d-2253-42ed-81a9-141faae62b3b\") " pod="kserve/model-serving-api-86f7b4b499-bn5gl"
Apr 23 16:40:11.956020 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:11.955922 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ad74a80-cfe8-463e-a0df-2b796c789b70-cert\") pod \"odh-model-controller-696fc77849-ps4x9\" (UID: \"9ad74a80-cfe8-463e-a0df-2b796c789b70\") " pod="kserve/odh-model-controller-696fc77849-ps4x9"
Apr 23 16:40:11.956066 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:40:11.956037 2580 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 23 16:40:11.956131 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:40:11.956120 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ad74a80-cfe8-463e-a0df-2b796c789b70-cert podName:9ad74a80-cfe8-463e-a0df-2b796c789b70 nodeName:}" failed. No retries permitted until 2026-04-23 16:40:12.956099474 +0000 UTC m=+323.218885405 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ad74a80-cfe8-463e-a0df-2b796c789b70-cert") pod "odh-model-controller-696fc77849-ps4x9" (UID: "9ad74a80-cfe8-463e-a0df-2b796c789b70") : secret "odh-model-controller-webhook-cert" not found
Apr 23 16:40:11.958719 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:11.958683 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a034fa4d-2253-42ed-81a9-141faae62b3b-tls-certs\") pod \"model-serving-api-86f7b4b499-bn5gl\" (UID: \"a034fa4d-2253-42ed-81a9-141faae62b3b\") " pod="kserve/model-serving-api-86f7b4b499-bn5gl"
Apr 23 16:40:12.124521 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:12.124489 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-bn5gl"
Apr 23 16:40:12.268145 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:12.268051 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-bn5gl"]
Apr 23 16:40:12.270863 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:40:12.270834 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda034fa4d_2253_42ed_81a9_141faae62b3b.slice/crio-dad306f53d87c5774633cb99b896ad561682e6efd989cd03195ecca2c2e9e964 WatchSource:0}: Error finding container dad306f53d87c5774633cb99b896ad561682e6efd989cd03195ecca2c2e9e964: Status 404 returned error can't find the container with id dad306f53d87c5774633cb99b896ad561682e6efd989cd03195ecca2c2e9e964
Apr 23 16:40:12.272526 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:12.272508 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 16:40:12.452416 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:12.452381 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-bn5gl" event={"ID":"a034fa4d-2253-42ed-81a9-141faae62b3b","Type":"ContainerStarted","Data":"dad306f53d87c5774633cb99b896ad561682e6efd989cd03195ecca2c2e9e964"}
Apr 23 16:40:12.962705 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:12.962653 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ad74a80-cfe8-463e-a0df-2b796c789b70-cert\") pod \"odh-model-controller-696fc77849-ps4x9\" (UID: \"9ad74a80-cfe8-463e-a0df-2b796c789b70\") " pod="kserve/odh-model-controller-696fc77849-ps4x9"
Apr 23 16:40:12.965413 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:12.965384 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ad74a80-cfe8-463e-a0df-2b796c789b70-cert\") pod \"odh-model-controller-696fc77849-ps4x9\" (UID: \"9ad74a80-cfe8-463e-a0df-2b796c789b70\") " pod="kserve/odh-model-controller-696fc77849-ps4x9"
Apr 23 16:40:13.040460 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:13.040427 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-ps4x9"
Apr 23 16:40:13.170146 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:13.170115 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-ps4x9"]
Apr 23 16:40:13.172632 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:40:13.172598 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ad74a80_cfe8_463e_a0df_2b796c789b70.slice/crio-5e06017c3eaf0f74a92b9f31e054fc783cecd8f29d246e7c9a85326d429f83c8 WatchSource:0}: Error finding container 5e06017c3eaf0f74a92b9f31e054fc783cecd8f29d246e7c9a85326d429f83c8: Status 404 returned error can't find the container with id 5e06017c3eaf0f74a92b9f31e054fc783cecd8f29d246e7c9a85326d429f83c8
Apr 23 16:40:13.457066 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:13.457028 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-ps4x9" event={"ID":"9ad74a80-cfe8-463e-a0df-2b796c789b70","Type":"ContainerStarted","Data":"5e06017c3eaf0f74a92b9f31e054fc783cecd8f29d246e7c9a85326d429f83c8"}
Apr 23 16:40:15.467038 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:15.467001 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-bn5gl" event={"ID":"a034fa4d-2253-42ed-81a9-141faae62b3b","Type":"ContainerStarted","Data":"bb421192490b84054943b2033a6b3e320608ec62c4d4b23911a7537e112d72a7"}
Apr 23 16:40:15.467463 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:15.467151 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-bn5gl"
Apr 23 16:40:15.488826 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:15.488781 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-bn5gl" podStartSLOduration=2.138343851 podStartE2EDuration="4.488765755s" podCreationTimestamp="2026-04-23 16:40:11 +0000 UTC" firstStartedPulling="2026-04-23 16:40:12.272658162 +0000 UTC m=+322.535444072" lastFinishedPulling="2026-04-23 16:40:14.623080052 +0000 UTC m=+324.885865976" observedRunningTime="2026-04-23 16:40:15.486616246 +0000 UTC m=+325.749402180" watchObservedRunningTime="2026-04-23 16:40:15.488765755 +0000 UTC m=+325.751551687"
Apr 23 16:40:17.475681 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:17.475638 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-ps4x9" event={"ID":"9ad74a80-cfe8-463e-a0df-2b796c789b70","Type":"ContainerStarted","Data":"1c12d165348e225cf12c045725f8c52d8e83fa627516768a7a43a180d9ba68b0"}
Apr 23 16:40:17.476109 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:17.475876 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-ps4x9"
Apr 23 16:40:17.496308 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:17.496260 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-ps4x9" podStartSLOduration=3.028377305 podStartE2EDuration="6.496246101s" podCreationTimestamp="2026-04-23 16:40:11 +0000 UTC" firstStartedPulling="2026-04-23 16:40:13.174329409 +0000 UTC m=+323.437115321" lastFinishedPulling="2026-04-23 16:40:16.642198204 +0000 UTC m=+326.904984117" observedRunningTime="2026-04-23 16:40:17.493458584 +0000 UTC m=+327.756244518" watchObservedRunningTime="2026-04-23 16:40:17.496246101 +0000 UTC m=+327.759032059"
Apr 23 16:40:26.474994 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:26.474967 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-bn5gl"
Apr 23 16:40:28.481424 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:28.481396 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-ps4x9"
Apr 23 16:40:55.867422 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:55.867337 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-55596bf557-ddzv5"]
Apr 23 16:40:55.874865 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:55.874837 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55596bf557-ddzv5"
Apr 23 16:40:55.890290 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:55.890261 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55596bf557-ddzv5"]
Apr 23 16:40:56.012185 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:56.012146 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/77120f66-0e26-41b8-97d7-bdbee45e9567-service-ca\") pod \"console-55596bf557-ddzv5\" (UID: \"77120f66-0e26-41b8-97d7-bdbee45e9567\") " pod="openshift-console/console-55596bf557-ddzv5"
Apr 23 16:40:56.012185 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:56.012187 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77120f66-0e26-41b8-97d7-bdbee45e9567-trusted-ca-bundle\") pod \"console-55596bf557-ddzv5\" (UID: \"77120f66-0e26-41b8-97d7-bdbee45e9567\") " pod="openshift-console/console-55596bf557-ddzv5"
Apr 23 16:40:56.012414 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:56.012213 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/77120f66-0e26-41b8-97d7-bdbee45e9567-oauth-serving-cert\") pod \"console-55596bf557-ddzv5\" (UID: \"77120f66-0e26-41b8-97d7-bdbee45e9567\") " pod="openshift-console/console-55596bf557-ddzv5"
Apr 23 16:40:56.012414 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:56.012318 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkmwg\" (UniqueName: \"kubernetes.io/projected/77120f66-0e26-41b8-97d7-bdbee45e9567-kube-api-access-bkmwg\") pod \"console-55596bf557-ddzv5\" (UID: \"77120f66-0e26-41b8-97d7-bdbee45e9567\") " pod="openshift-console/console-55596bf557-ddzv5"
Apr 23 16:40:56.012414 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:56.012376 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/77120f66-0e26-41b8-97d7-bdbee45e9567-console-oauth-config\") pod \"console-55596bf557-ddzv5\" (UID: \"77120f66-0e26-41b8-97d7-bdbee45e9567\") " pod="openshift-console/console-55596bf557-ddzv5"
Apr 23 16:40:56.012414 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:56.012400 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/77120f66-0e26-41b8-97d7-bdbee45e9567-console-config\") pod \"console-55596bf557-ddzv5\" (UID: \"77120f66-0e26-41b8-97d7-bdbee45e9567\") " pod="openshift-console/console-55596bf557-ddzv5"
Apr 23 16:40:56.012540 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:56.012424 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/77120f66-0e26-41b8-97d7-bdbee45e9567-console-serving-cert\") pod \"console-55596bf557-ddzv5\" (UID: \"77120f66-0e26-41b8-97d7-bdbee45e9567\") " pod="openshift-console/console-55596bf557-ddzv5"
Apr 23 16:40:56.112954 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:56.112911 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/77120f66-0e26-41b8-97d7-bdbee45e9567-oauth-serving-cert\") pod \"console-55596bf557-ddzv5\" (UID: \"77120f66-0e26-41b8-97d7-bdbee45e9567\") " pod="openshift-console/console-55596bf557-ddzv5"
Apr 23 16:40:56.113159 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:56.112997 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkmwg\" (UniqueName: \"kubernetes.io/projected/77120f66-0e26-41b8-97d7-bdbee45e9567-kube-api-access-bkmwg\") pod \"console-55596bf557-ddzv5\" (UID: \"77120f66-0e26-41b8-97d7-bdbee45e9567\") " pod="openshift-console/console-55596bf557-ddzv5"
Apr 23 16:40:56.113159 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:56.113044 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/77120f66-0e26-41b8-97d7-bdbee45e9567-console-oauth-config\") pod \"console-55596bf557-ddzv5\" (UID: \"77120f66-0e26-41b8-97d7-bdbee45e9567\") " pod="openshift-console/console-55596bf557-ddzv5"
Apr 23 16:40:56.113159 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:56.113067 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/77120f66-0e26-41b8-97d7-bdbee45e9567-console-config\") pod \"console-55596bf557-ddzv5\" (UID: \"77120f66-0e26-41b8-97d7-bdbee45e9567\") " pod="openshift-console/console-55596bf557-ddzv5"
Apr 23 16:40:56.113159 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:56.113097 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/77120f66-0e26-41b8-97d7-bdbee45e9567-console-serving-cert\") pod \"console-55596bf557-ddzv5\" (UID: \"77120f66-0e26-41b8-97d7-bdbee45e9567\") " pod="openshift-console/console-55596bf557-ddzv5"
Apr 23 16:40:56.113159 ip-10-0-142-4
kubenswrapper[2580]: I0423 16:40:56.113127 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/77120f66-0e26-41b8-97d7-bdbee45e9567-service-ca\") pod \"console-55596bf557-ddzv5\" (UID: \"77120f66-0e26-41b8-97d7-bdbee45e9567\") " pod="openshift-console/console-55596bf557-ddzv5" Apr 23 16:40:56.113419 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:56.113168 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77120f66-0e26-41b8-97d7-bdbee45e9567-trusted-ca-bundle\") pod \"console-55596bf557-ddzv5\" (UID: \"77120f66-0e26-41b8-97d7-bdbee45e9567\") " pod="openshift-console/console-55596bf557-ddzv5" Apr 23 16:40:56.113810 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:56.113778 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/77120f66-0e26-41b8-97d7-bdbee45e9567-oauth-serving-cert\") pod \"console-55596bf557-ddzv5\" (UID: \"77120f66-0e26-41b8-97d7-bdbee45e9567\") " pod="openshift-console/console-55596bf557-ddzv5" Apr 23 16:40:56.113923 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:56.113863 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/77120f66-0e26-41b8-97d7-bdbee45e9567-service-ca\") pod \"console-55596bf557-ddzv5\" (UID: \"77120f66-0e26-41b8-97d7-bdbee45e9567\") " pod="openshift-console/console-55596bf557-ddzv5" Apr 23 16:40:56.113995 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:56.113970 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/77120f66-0e26-41b8-97d7-bdbee45e9567-console-config\") pod \"console-55596bf557-ddzv5\" (UID: \"77120f66-0e26-41b8-97d7-bdbee45e9567\") " pod="openshift-console/console-55596bf557-ddzv5" Apr 23 16:40:56.114096 
ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:56.114008 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77120f66-0e26-41b8-97d7-bdbee45e9567-trusted-ca-bundle\") pod \"console-55596bf557-ddzv5\" (UID: \"77120f66-0e26-41b8-97d7-bdbee45e9567\") " pod="openshift-console/console-55596bf557-ddzv5" Apr 23 16:40:56.115625 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:56.115605 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/77120f66-0e26-41b8-97d7-bdbee45e9567-console-oauth-config\") pod \"console-55596bf557-ddzv5\" (UID: \"77120f66-0e26-41b8-97d7-bdbee45e9567\") " pod="openshift-console/console-55596bf557-ddzv5" Apr 23 16:40:56.115775 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:56.115757 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/77120f66-0e26-41b8-97d7-bdbee45e9567-console-serving-cert\") pod \"console-55596bf557-ddzv5\" (UID: \"77120f66-0e26-41b8-97d7-bdbee45e9567\") " pod="openshift-console/console-55596bf557-ddzv5" Apr 23 16:40:56.123976 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:56.123878 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkmwg\" (UniqueName: \"kubernetes.io/projected/77120f66-0e26-41b8-97d7-bdbee45e9567-kube-api-access-bkmwg\") pod \"console-55596bf557-ddzv5\" (UID: \"77120f66-0e26-41b8-97d7-bdbee45e9567\") " pod="openshift-console/console-55596bf557-ddzv5" Apr 23 16:40:56.184874 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:56.184833 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55596bf557-ddzv5" Apr 23 16:40:56.321512 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:56.321484 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55596bf557-ddzv5"] Apr 23 16:40:56.323657 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:40:56.323568 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77120f66_0e26_41b8_97d7_bdbee45e9567.slice/crio-7a4146c21d88578a4c75f9089ce726d497700f390df9794bd214e7607d282112 WatchSource:0}: Error finding container 7a4146c21d88578a4c75f9089ce726d497700f390df9794bd214e7607d282112: Status 404 returned error can't find the container with id 7a4146c21d88578a4c75f9089ce726d497700f390df9794bd214e7607d282112 Apr 23 16:40:56.616186 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:56.616149 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55596bf557-ddzv5" event={"ID":"77120f66-0e26-41b8-97d7-bdbee45e9567","Type":"ContainerStarted","Data":"fbdf9b3151290ca36d6bd137e7a2319aacf9fe4a36d19f02371a9d62132ff4fd"} Apr 23 16:40:56.616186 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:56.616186 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55596bf557-ddzv5" event={"ID":"77120f66-0e26-41b8-97d7-bdbee45e9567","Type":"ContainerStarted","Data":"7a4146c21d88578a4c75f9089ce726d497700f390df9794bd214e7607d282112"} Apr 23 16:40:56.641339 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:40:56.641244 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-55596bf557-ddzv5" podStartSLOduration=1.641230705 podStartE2EDuration="1.641230705s" podCreationTimestamp="2026-04-23 16:40:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:40:56.640875783 +0000 UTC m=+366.903661714" 
watchObservedRunningTime="2026-04-23 16:40:56.641230705 +0000 UTC m=+366.904016637" Apr 23 16:41:06.185870 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:06.185819 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-55596bf557-ddzv5" Apr 23 16:41:06.186402 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:06.186113 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-55596bf557-ddzv5" Apr 23 16:41:06.190837 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:06.190818 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-55596bf557-ddzv5" Apr 23 16:41:06.656646 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:06.656616 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-55596bf557-ddzv5" Apr 23 16:41:06.716177 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:06.716145 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-77956f5445-ks96n"] Apr 23 16:41:31.741332 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:31.741286 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-77956f5445-ks96n" podUID="4f8578f4-f872-4adc-ab1b-d3ce1f3623c9" containerName="console" containerID="cri-o://629f8cf0753fe448c50c0cea546615810f27951fac55b08b103d87e433ea04af" gracePeriod=15 Apr 23 16:41:31.982974 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:31.982951 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77956f5445-ks96n_4f8578f4-f872-4adc-ab1b-d3ce1f3623c9/console/0.log" Apr 23 16:41:31.983107 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:31.983011 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77956f5445-ks96n" Apr 23 16:41:32.016074 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.015994 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-service-ca\") pod \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\" (UID: \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\") " Apr 23 16:41:32.016074 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.016028 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-oauth-serving-cert\") pod \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\" (UID: \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\") " Apr 23 16:41:32.016074 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.016074 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-trusted-ca-bundle\") pod \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\" (UID: \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\") " Apr 23 16:41:32.016323 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.016130 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-console-config\") pod \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\" (UID: \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\") " Apr 23 16:41:32.016379 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.016333 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-console-oauth-config\") pod \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\" (UID: \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\") " Apr 23 16:41:32.016379 ip-10-0-142-4 
kubenswrapper[2580]: I0423 16:41:32.016348 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-service-ca" (OuterVolumeSpecName: "service-ca") pod "4f8578f4-f872-4adc-ab1b-d3ce1f3623c9" (UID: "4f8578f4-f872-4adc-ab1b-d3ce1f3623c9"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:41:32.016478 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.016378 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsrgs\" (UniqueName: \"kubernetes.io/projected/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-kube-api-access-tsrgs\") pod \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\" (UID: \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\") " Apr 23 16:41:32.016478 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.016418 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-console-serving-cert\") pod \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\" (UID: \"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9\") " Apr 23 16:41:32.016570 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.016507 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4f8578f4-f872-4adc-ab1b-d3ce1f3623c9" (UID: "4f8578f4-f872-4adc-ab1b-d3ce1f3623c9"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:41:32.016570 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.016513 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-console-config" (OuterVolumeSpecName: "console-config") pod "4f8578f4-f872-4adc-ab1b-d3ce1f3623c9" (UID: "4f8578f4-f872-4adc-ab1b-d3ce1f3623c9"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:41:32.016570 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.016545 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4f8578f4-f872-4adc-ab1b-d3ce1f3623c9" (UID: "4f8578f4-f872-4adc-ab1b-d3ce1f3623c9"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:41:32.016749 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.016730 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-console-config\") on node \"ip-10-0-142-4.ec2.internal\" DevicePath \"\"" Apr 23 16:41:32.016802 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.016755 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-service-ca\") on node \"ip-10-0-142-4.ec2.internal\" DevicePath \"\"" Apr 23 16:41:32.016802 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.016769 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-oauth-serving-cert\") on node \"ip-10-0-142-4.ec2.internal\" DevicePath \"\"" Apr 23 16:41:32.016802 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.016784 2580 
reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-trusted-ca-bundle\") on node \"ip-10-0-142-4.ec2.internal\" DevicePath \"\"" Apr 23 16:41:32.018476 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.018455 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4f8578f4-f872-4adc-ab1b-d3ce1f3623c9" (UID: "4f8578f4-f872-4adc-ab1b-d3ce1f3623c9"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:41:32.018595 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.018553 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-kube-api-access-tsrgs" (OuterVolumeSpecName: "kube-api-access-tsrgs") pod "4f8578f4-f872-4adc-ab1b-d3ce1f3623c9" (UID: "4f8578f4-f872-4adc-ab1b-d3ce1f3623c9"). InnerVolumeSpecName "kube-api-access-tsrgs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:41:32.018891 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.018873 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4f8578f4-f872-4adc-ab1b-d3ce1f3623c9" (UID: "4f8578f4-f872-4adc-ab1b-d3ce1f3623c9"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:41:32.117397 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.117362 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-console-oauth-config\") on node \"ip-10-0-142-4.ec2.internal\" DevicePath \"\"" Apr 23 16:41:32.117397 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.117391 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tsrgs\" (UniqueName: \"kubernetes.io/projected/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-kube-api-access-tsrgs\") on node \"ip-10-0-142-4.ec2.internal\" DevicePath \"\"" Apr 23 16:41:32.117397 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.117401 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9-console-serving-cert\") on node \"ip-10-0-142-4.ec2.internal\" DevicePath \"\"" Apr 23 16:41:32.740362 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.740333 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77956f5445-ks96n_4f8578f4-f872-4adc-ab1b-d3ce1f3623c9/console/0.log" Apr 23 16:41:32.740545 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.740378 2580 generic.go:358] "Generic (PLEG): container finished" podID="4f8578f4-f872-4adc-ab1b-d3ce1f3623c9" containerID="629f8cf0753fe448c50c0cea546615810f27951fac55b08b103d87e433ea04af" exitCode=2 Apr 23 16:41:32.740545 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.740452 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77956f5445-ks96n" Apr 23 16:41:32.740545 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.740460 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77956f5445-ks96n" event={"ID":"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9","Type":"ContainerDied","Data":"629f8cf0753fe448c50c0cea546615810f27951fac55b08b103d87e433ea04af"} Apr 23 16:41:32.740545 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.740494 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77956f5445-ks96n" event={"ID":"4f8578f4-f872-4adc-ab1b-d3ce1f3623c9","Type":"ContainerDied","Data":"0b5b7c57414fb3cecf9a2e7c7fa83947a8d4181a17aa2d0f7c16b1d427cfa5e7"} Apr 23 16:41:32.740545 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.740512 2580 scope.go:117] "RemoveContainer" containerID="629f8cf0753fe448c50c0cea546615810f27951fac55b08b103d87e433ea04af" Apr 23 16:41:32.748707 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.748499 2580 scope.go:117] "RemoveContainer" containerID="629f8cf0753fe448c50c0cea546615810f27951fac55b08b103d87e433ea04af" Apr 23 16:41:32.749011 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:41:32.748783 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"629f8cf0753fe448c50c0cea546615810f27951fac55b08b103d87e433ea04af\": container with ID starting with 629f8cf0753fe448c50c0cea546615810f27951fac55b08b103d87e433ea04af not found: ID does not exist" containerID="629f8cf0753fe448c50c0cea546615810f27951fac55b08b103d87e433ea04af" Apr 23 16:41:32.749011 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.748810 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"629f8cf0753fe448c50c0cea546615810f27951fac55b08b103d87e433ea04af"} err="failed to get container status \"629f8cf0753fe448c50c0cea546615810f27951fac55b08b103d87e433ea04af\": rpc error: code = NotFound desc 
= could not find container \"629f8cf0753fe448c50c0cea546615810f27951fac55b08b103d87e433ea04af\": container with ID starting with 629f8cf0753fe448c50c0cea546615810f27951fac55b08b103d87e433ea04af not found: ID does not exist" Apr 23 16:41:32.758759 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.758734 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-77956f5445-ks96n"] Apr 23 16:41:32.765150 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:32.765127 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-77956f5445-ks96n"] Apr 23 16:41:34.282622 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:41:34.282517 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f8578f4-f872-4adc-ab1b-d3ce1f3623c9" path="/var/lib/kubelet/pods/4f8578f4-f872-4adc-ab1b-d3ce1f3623c9/volumes" Apr 23 16:44:50.178295 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:44:50.178267 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xvn7t_84c993c8-4dd2-40dc-b624-68a9f75a89cb/ovn-acl-logging/0.log" Apr 23 16:44:50.180209 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:44:50.180179 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xvn7t_84c993c8-4dd2-40dc-b624-68a9f75a89cb/ovn-acl-logging/0.log" Apr 23 16:45:44.942234 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:45:44.942199 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-884e2-predictor-6d78bc8b8c-l4t8w"] Apr 23 16:45:44.942744 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:45:44.942593 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f8578f4-f872-4adc-ab1b-d3ce1f3623c9" containerName="console" Apr 23 16:45:44.942744 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:45:44.942608 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f8578f4-f872-4adc-ab1b-d3ce1f3623c9" containerName="console" Apr 
23 16:45:44.942744 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:45:44.942676 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f8578f4-f872-4adc-ab1b-d3ce1f3623c9" containerName="console" Apr 23 16:45:44.945529 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:45:44.945509 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-884e2-predictor-6d78bc8b8c-l4t8w" Apr 23 16:45:44.948384 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:45:44.948360 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-p7xp4\"" Apr 23 16:45:44.955967 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:45:44.955946 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-884e2-predictor-6d78bc8b8c-l4t8w" Apr 23 16:45:44.965263 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:45:44.965235 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-884e2-predictor-6d78bc8b8c-l4t8w"] Apr 23 16:45:45.088734 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:45:45.088701 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-884e2-predictor-6d78bc8b8c-l4t8w"] Apr 23 16:45:45.092381 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:45:45.092351 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15456bfd_de85_4f11_9f9e_2d96a2c87151.slice/crio-c63f88a6b48d55c98f1b225aa10dec63934b9059daffb582ed1b0edd50bc5667 WatchSource:0}: Error finding container c63f88a6b48d55c98f1b225aa10dec63934b9059daffb582ed1b0edd50bc5667: Status 404 returned error can't find the container with id c63f88a6b48d55c98f1b225aa10dec63934b9059daffb582ed1b0edd50bc5667 Apr 23 16:45:45.094050 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:45:45.094034 2580 provider.go:93] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Apr 23 16:45:45.568639 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:45:45.568604 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-884e2-predictor-6d78bc8b8c-l4t8w" event={"ID":"15456bfd-de85-4f11-9f9e-2d96a2c87151","Type":"ContainerStarted","Data":"c63f88a6b48d55c98f1b225aa10dec63934b9059daffb582ed1b0edd50bc5667"} Apr 23 16:45:46.573399 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:45:46.573362 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-884e2-predictor-6d78bc8b8c-l4t8w" event={"ID":"15456bfd-de85-4f11-9f9e-2d96a2c87151","Type":"ContainerStarted","Data":"9b134ec1b6e2986406bb4548db3a1e866b4049f6bd939832f81ba31ae911f857"} Apr 23 16:45:46.573969 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:45:46.573586 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-884e2-predictor-6d78bc8b8c-l4t8w" Apr 23 16:45:46.575293 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:45:46.575274 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-884e2-predictor-6d78bc8b8c-l4t8w" Apr 23 16:45:46.590825 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:45:46.590778 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-884e2-predictor-6d78bc8b8c-l4t8w" podStartSLOduration=1.602607589 podStartE2EDuration="2.590764618s" podCreationTimestamp="2026-04-23 16:45:44 +0000 UTC" firstStartedPulling="2026-04-23 16:45:45.094157726 +0000 UTC m=+655.356943636" lastFinishedPulling="2026-04-23 16:45:46.082314751 +0000 UTC m=+656.345100665" observedRunningTime="2026-04-23 16:45:46.588679731 +0000 UTC m=+656.851465663" watchObservedRunningTime="2026-04-23 16:45:46.590764618 +0000 UTC m=+656.853550549" Apr 23 16:47:09.942989 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:47:09.942959 2580 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-884e2-predictor-6d78bc8b8c-l4t8w_15456bfd-de85-4f11-9f9e-2d96a2c87151/kserve-container/0.log" Apr 23 16:47:10.250725 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:47:10.250691 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-884e2-predictor-6d78bc8b8c-l4t8w"] Apr 23 16:47:10.250971 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:47:10.250940 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-884e2-predictor-6d78bc8b8c-l4t8w" podUID="15456bfd-de85-4f11-9f9e-2d96a2c87151" containerName="kserve-container" containerID="cri-o://9b134ec1b6e2986406bb4548db3a1e866b4049f6bd939832f81ba31ae911f857" gracePeriod=30 Apr 23 16:47:10.503600 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:47:10.503506 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-884e2-predictor-6d78bc8b8c-l4t8w" Apr 23 16:47:10.861003 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:47:10.860899 2580 generic.go:358] "Generic (PLEG): container finished" podID="15456bfd-de85-4f11-9f9e-2d96a2c87151" containerID="9b134ec1b6e2986406bb4548db3a1e866b4049f6bd939832f81ba31ae911f857" exitCode=2 Apr 23 16:47:10.861003 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:47:10.860976 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-884e2-predictor-6d78bc8b8c-l4t8w" Apr 23 16:47:10.861003 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:47:10.860992 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-884e2-predictor-6d78bc8b8c-l4t8w" event={"ID":"15456bfd-de85-4f11-9f9e-2d96a2c87151","Type":"ContainerDied","Data":"9b134ec1b6e2986406bb4548db3a1e866b4049f6bd939832f81ba31ae911f857"} Apr 23 16:47:10.861275 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:47:10.861038 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-884e2-predictor-6d78bc8b8c-l4t8w" event={"ID":"15456bfd-de85-4f11-9f9e-2d96a2c87151","Type":"ContainerDied","Data":"c63f88a6b48d55c98f1b225aa10dec63934b9059daffb582ed1b0edd50bc5667"} Apr 23 16:47:10.861275 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:47:10.861058 2580 scope.go:117] "RemoveContainer" containerID="9b134ec1b6e2986406bb4548db3a1e866b4049f6bd939832f81ba31ae911f857" Apr 23 16:47:10.869603 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:47:10.869565 2580 scope.go:117] "RemoveContainer" containerID="9b134ec1b6e2986406bb4548db3a1e866b4049f6bd939832f81ba31ae911f857" Apr 23 16:47:10.869908 ip-10-0-142-4 kubenswrapper[2580]: E0423 16:47:10.869888 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b134ec1b6e2986406bb4548db3a1e866b4049f6bd939832f81ba31ae911f857\": container with ID starting with 9b134ec1b6e2986406bb4548db3a1e866b4049f6bd939832f81ba31ae911f857 not found: ID does not exist" containerID="9b134ec1b6e2986406bb4548db3a1e866b4049f6bd939832f81ba31ae911f857" Apr 23 16:47:10.869968 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:47:10.869925 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b134ec1b6e2986406bb4548db3a1e866b4049f6bd939832f81ba31ae911f857"} err="failed to get container status 
\"9b134ec1b6e2986406bb4548db3a1e866b4049f6bd939832f81ba31ae911f857\": rpc error: code = NotFound desc = could not find container \"9b134ec1b6e2986406bb4548db3a1e866b4049f6bd939832f81ba31ae911f857\": container with ID starting with 9b134ec1b6e2986406bb4548db3a1e866b4049f6bd939832f81ba31ae911f857 not found: ID does not exist" Apr 23 16:47:10.881705 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:47:10.881672 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-884e2-predictor-6d78bc8b8c-l4t8w"] Apr 23 16:47:10.885517 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:47:10.885487 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-884e2-predictor-6d78bc8b8c-l4t8w"] Apr 23 16:47:12.282357 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:47:12.282317 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15456bfd-de85-4f11-9f9e-2d96a2c87151" path="/var/lib/kubelet/pods/15456bfd-de85-4f11-9f9e-2d96a2c87151/volumes" Apr 23 16:49:50.200409 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:49:50.200381 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xvn7t_84c993c8-4dd2-40dc-b624-68a9f75a89cb/ovn-acl-logging/0.log" Apr 23 16:49:50.203122 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:49:50.203099 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xvn7t_84c993c8-4dd2-40dc-b624-68a9f75a89cb/ovn-acl-logging/0.log" Apr 23 16:54:04.732868 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:04.732841 2580 ???:1] "http: TLS handshake error from 10.0.133.231:52062: EOF" Apr 23 16:54:04.737881 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:04.737843 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-7lg5k_4b505881-5503-4e1f-b72b-0d8abde1a5e0/global-pull-secret-syncer/0.log" Apr 23 16:54:04.887306 ip-10-0-142-4 kubenswrapper[2580]: I0423 
16:54:04.887221 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-cdgwc_7f20dda8-0907-46fa-84c0-d1304b1105df/konnectivity-agent/0.log" Apr 23 16:54:05.015367 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:05.015332 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-4.ec2.internal_6ca4686b7542221f742a8d14ace1d047/haproxy/0.log" Apr 23 16:54:08.422624 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:08.422518 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-q8ql4_4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2/kube-state-metrics/0.log" Apr 23 16:54:08.443882 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:08.443845 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-q8ql4_4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2/kube-rbac-proxy-main/0.log" Apr 23 16:54:08.464019 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:08.463986 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-q8ql4_4abd8a2b-c2da-41d7-8806-7ccb0fdbeae2/kube-rbac-proxy-self/0.log" Apr 23 16:54:08.497319 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:08.497267 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7c7c78bc5d-5knsj_e32dc24f-77a9-4c9d-a568-3e7866f08632/metrics-server/0.log" Apr 23 16:54:08.520373 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:08.520344 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-tvs8x_241fa968-de71-4d41-bb5d-d2886ebb5366/monitoring-plugin/0.log" Apr 23 16:54:08.670800 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:08.670758 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pvhzm_f8cb88e9-6f22-4927-807b-b213102a45ed/node-exporter/0.log" Apr 23 
16:54:08.693439 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:08.693368 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pvhzm_f8cb88e9-6f22-4927-807b-b213102a45ed/kube-rbac-proxy/0.log" Apr 23 16:54:08.716078 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:08.716053 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pvhzm_f8cb88e9-6f22-4927-807b-b213102a45ed/init-textfile/0.log" Apr 23 16:54:09.167597 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:09.167536 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-547d785944-t6rr7_12e1e988-537f-4918-b283-86b348f0c63f/telemeter-client/0.log" Apr 23 16:54:09.186973 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:09.186944 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-547d785944-t6rr7_12e1e988-537f-4918-b283-86b348f0c63f/reload/0.log" Apr 23 16:54:09.211448 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:09.211423 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-547d785944-t6rr7_12e1e988-537f-4918-b283-86b348f0c63f/kube-rbac-proxy/0.log" Apr 23 16:54:09.247713 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:09.247684 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-d749c5fc6-5sfqd_7e9f2050-7154-45e4-8941-45715943a2f9/thanos-query/0.log" Apr 23 16:54:09.268782 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:09.268753 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-d749c5fc6-5sfqd_7e9f2050-7154-45e4-8941-45715943a2f9/kube-rbac-proxy-web/0.log" Apr 23 16:54:09.289836 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:09.289806 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-d749c5fc6-5sfqd_7e9f2050-7154-45e4-8941-45715943a2f9/kube-rbac-proxy/0.log" Apr 23 16:54:09.312457 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:09.312425 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-d749c5fc6-5sfqd_7e9f2050-7154-45e4-8941-45715943a2f9/prom-label-proxy/0.log" Apr 23 16:54:09.332367 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:09.332340 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-d749c5fc6-5sfqd_7e9f2050-7154-45e4-8941-45715943a2f9/kube-rbac-proxy-rules/0.log" Apr 23 16:54:09.356323 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:09.356292 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-d749c5fc6-5sfqd_7e9f2050-7154-45e4-8941-45715943a2f9/kube-rbac-proxy-metrics/0.log" Apr 23 16:54:11.301490 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:11.301463 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55596bf557-ddzv5_77120f66-0e26-41b8-97d7-bdbee45e9567/console/0.log" Apr 23 16:54:11.332902 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:11.332869 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-7zcwv_e2859776-cf83-49ed-ac30-69c2ec6863e5/download-server/0.log" Apr 23 16:54:11.904110 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:11.904071 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gwvnc/perf-node-gather-daemonset-s7ckh"] Apr 23 16:54:11.904504 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:11.904479 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15456bfd-de85-4f11-9f9e-2d96a2c87151" containerName="kserve-container" Apr 23 16:54:11.904664 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:11.904531 2580 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="15456bfd-de85-4f11-9f9e-2d96a2c87151" containerName="kserve-container" Apr 23 16:54:11.904727 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:11.904695 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="15456bfd-de85-4f11-9f9e-2d96a2c87151" containerName="kserve-container" Apr 23 16:54:11.907835 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:11.907814 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-s7ckh" Apr 23 16:54:11.910372 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:11.910353 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gwvnc\"/\"openshift-service-ca.crt\"" Apr 23 16:54:11.910509 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:11.910353 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gwvnc\"/\"kube-root-ca.crt\"" Apr 23 16:54:11.911403 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:11.911388 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-gwvnc\"/\"default-dockercfg-kgzpm\"" Apr 23 16:54:11.916992 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:11.916960 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gwvnc/perf-node-gather-daemonset-s7ckh"] Apr 23 16:54:12.016521 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:12.016477 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgpz4\" (UniqueName: \"kubernetes.io/projected/2aecc60a-69c3-4fbf-8500-162e3490216c-kube-api-access-kgpz4\") pod \"perf-node-gather-daemonset-s7ckh\" (UID: \"2aecc60a-69c3-4fbf-8500-162e3490216c\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-s7ckh" Apr 23 16:54:12.016746 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:12.016535 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2aecc60a-69c3-4fbf-8500-162e3490216c-sys\") pod \"perf-node-gather-daemonset-s7ckh\" (UID: \"2aecc60a-69c3-4fbf-8500-162e3490216c\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-s7ckh" Apr 23 16:54:12.016746 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:12.016610 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2aecc60a-69c3-4fbf-8500-162e3490216c-podres\") pod \"perf-node-gather-daemonset-s7ckh\" (UID: \"2aecc60a-69c3-4fbf-8500-162e3490216c\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-s7ckh" Apr 23 16:54:12.016746 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:12.016630 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2aecc60a-69c3-4fbf-8500-162e3490216c-proc\") pod \"perf-node-gather-daemonset-s7ckh\" (UID: \"2aecc60a-69c3-4fbf-8500-162e3490216c\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-s7ckh" Apr 23 16:54:12.016746 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:12.016716 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2aecc60a-69c3-4fbf-8500-162e3490216c-lib-modules\") pod \"perf-node-gather-daemonset-s7ckh\" (UID: \"2aecc60a-69c3-4fbf-8500-162e3490216c\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-s7ckh" Apr 23 16:54:12.117644 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:12.117569 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2aecc60a-69c3-4fbf-8500-162e3490216c-sys\") pod \"perf-node-gather-daemonset-s7ckh\" (UID: \"2aecc60a-69c3-4fbf-8500-162e3490216c\") " 
pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-s7ckh" Apr 23 16:54:12.117644 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:12.117661 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2aecc60a-69c3-4fbf-8500-162e3490216c-podres\") pod \"perf-node-gather-daemonset-s7ckh\" (UID: \"2aecc60a-69c3-4fbf-8500-162e3490216c\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-s7ckh" Apr 23 16:54:12.117907 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:12.117681 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2aecc60a-69c3-4fbf-8500-162e3490216c-proc\") pod \"perf-node-gather-daemonset-s7ckh\" (UID: \"2aecc60a-69c3-4fbf-8500-162e3490216c\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-s7ckh" Apr 23 16:54:12.117907 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:12.117700 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2aecc60a-69c3-4fbf-8500-162e3490216c-sys\") pod \"perf-node-gather-daemonset-s7ckh\" (UID: \"2aecc60a-69c3-4fbf-8500-162e3490216c\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-s7ckh" Apr 23 16:54:12.117907 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:12.117720 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2aecc60a-69c3-4fbf-8500-162e3490216c-lib-modules\") pod \"perf-node-gather-daemonset-s7ckh\" (UID: \"2aecc60a-69c3-4fbf-8500-162e3490216c\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-s7ckh" Apr 23 16:54:12.117907 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:12.117772 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgpz4\" (UniqueName: 
\"kubernetes.io/projected/2aecc60a-69c3-4fbf-8500-162e3490216c-kube-api-access-kgpz4\") pod \"perf-node-gather-daemonset-s7ckh\" (UID: \"2aecc60a-69c3-4fbf-8500-162e3490216c\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-s7ckh" Apr 23 16:54:12.117907 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:12.117841 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2aecc60a-69c3-4fbf-8500-162e3490216c-podres\") pod \"perf-node-gather-daemonset-s7ckh\" (UID: \"2aecc60a-69c3-4fbf-8500-162e3490216c\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-s7ckh" Apr 23 16:54:12.117907 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:12.117842 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2aecc60a-69c3-4fbf-8500-162e3490216c-proc\") pod \"perf-node-gather-daemonset-s7ckh\" (UID: \"2aecc60a-69c3-4fbf-8500-162e3490216c\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-s7ckh" Apr 23 16:54:12.118091 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:12.117921 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2aecc60a-69c3-4fbf-8500-162e3490216c-lib-modules\") pod \"perf-node-gather-daemonset-s7ckh\" (UID: \"2aecc60a-69c3-4fbf-8500-162e3490216c\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-s7ckh" Apr 23 16:54:12.126757 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:12.126720 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgpz4\" (UniqueName: \"kubernetes.io/projected/2aecc60a-69c3-4fbf-8500-162e3490216c-kube-api-access-kgpz4\") pod \"perf-node-gather-daemonset-s7ckh\" (UID: \"2aecc60a-69c3-4fbf-8500-162e3490216c\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-s7ckh" Apr 23 16:54:12.218676 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:12.218649 2580 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-s7ckh" Apr 23 16:54:12.348780 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:12.348749 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gwvnc/perf-node-gather-daemonset-s7ckh"] Apr 23 16:54:12.350820 ip-10-0-142-4 kubenswrapper[2580]: W0423 16:54:12.350788 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2aecc60a_69c3_4fbf_8500_162e3490216c.slice/crio-7a02aaec5517bab66eb5db6ab1c7e5324e72208ed97c6f9bd86297efb11867ec WatchSource:0}: Error finding container 7a02aaec5517bab66eb5db6ab1c7e5324e72208ed97c6f9bd86297efb11867ec: Status 404 returned error can't find the container with id 7a02aaec5517bab66eb5db6ab1c7e5324e72208ed97c6f9bd86297efb11867ec Apr 23 16:54:12.352284 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:12.352269 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 16:54:12.487525 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:12.487448 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vsd8g_cda3ba79-1ac0-44e6-99b3-60c2f4882479/dns/0.log" Apr 23 16:54:12.508394 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:12.508362 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vsd8g_cda3ba79-1ac0-44e6-99b3-60c2f4882479/kube-rbac-proxy/0.log" Apr 23 16:54:12.530490 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:12.530464 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-78hbm_325c692e-07b3-4dcc-984b-733489080887/dns-node-resolver/0.log" Apr 23 16:54:13.013027 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:13.012985 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rshv8_95e238a6-f3c9-4b3a-a7de-1bec7cf6b287/node-ca/0.log" Apr 23 16:54:13.278285 
ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:13.278193 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-s7ckh" event={"ID":"2aecc60a-69c3-4fbf-8500-162e3490216c","Type":"ContainerStarted","Data":"8885bbae711ae51c8579428b822b44a32fd752392025bba262e51f4f1dd4163d"} Apr 23 16:54:13.278285 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:13.278234 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-s7ckh" event={"ID":"2aecc60a-69c3-4fbf-8500-162e3490216c","Type":"ContainerStarted","Data":"7a02aaec5517bab66eb5db6ab1c7e5324e72208ed97c6f9bd86297efb11867ec"} Apr 23 16:54:13.278285 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:13.278276 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-s7ckh" Apr 23 16:54:13.295630 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:13.295567 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-s7ckh" podStartSLOduration=2.29554964 podStartE2EDuration="2.29554964s" podCreationTimestamp="2026-04-23 16:54:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:54:13.295046617 +0000 UTC m=+1163.557832566" watchObservedRunningTime="2026-04-23 16:54:13.29554964 +0000 UTC m=+1163.558335568" Apr 23 16:54:14.060355 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:14.060325 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-d4d84_b45de36f-3b6e-4c0e-a301-4fa4d410f228/serve-healthcheck-canary/0.log" Apr 23 16:54:14.631125 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:14.631091 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-pmxx2_2846178e-072d-415f-9774-a498aa844964/kube-rbac-proxy/0.log" Apr 23 16:54:14.652591 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:14.652499 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pmxx2_2846178e-072d-415f-9774-a498aa844964/exporter/0.log" Apr 23 16:54:14.673676 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:14.673648 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pmxx2_2846178e-072d-415f-9774-a498aa844964/extractor/0.log" Apr 23 16:54:16.548398 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:16.548367 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-5b898d7b9d-pjflc_4f57cc79-4d5f-4f58-89e9-b66bce0b28fe/manager/0.log" Apr 23 16:54:16.566633 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:16.566606 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-6b94ff949c-6f9zw_d305ef47-e072-458a-b817-c133935e2157/manager/0.log" Apr 23 16:54:16.602778 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:16.602750 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-bn5gl_a034fa4d-2253-42ed-81a9-141faae62b3b/server/0.log" Apr 23 16:54:16.730962 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:16.730925 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-ps4x9_9ad74a80-cfe8-463e-a0df-2b796c789b70/manager/0.log" Apr 23 16:54:16.776971 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:16.776908 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-dtdh7_8679184f-be15-4071-9ec1-92da472ff0d4/seaweedfs/0.log" Apr 23 16:54:19.292383 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:19.292351 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-s7ckh" Apr 23 16:54:21.768921 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:21.768891 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8rnmk_76787c69-1999-41dd-9713-d68801605aa8/kube-multus/0.log" Apr 23 16:54:21.793262 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:21.793192 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-58q6m_9696cbb9-a1db-4ead-914d-e2d11faa33b6/kube-multus-additional-cni-plugins/0.log" Apr 23 16:54:21.818526 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:21.818502 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-58q6m_9696cbb9-a1db-4ead-914d-e2d11faa33b6/egress-router-binary-copy/0.log" Apr 23 16:54:21.841515 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:21.841487 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-58q6m_9696cbb9-a1db-4ead-914d-e2d11faa33b6/cni-plugins/0.log" Apr 23 16:54:21.862406 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:21.862379 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-58q6m_9696cbb9-a1db-4ead-914d-e2d11faa33b6/bond-cni-plugin/0.log" Apr 23 16:54:21.882426 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:21.882399 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-58q6m_9696cbb9-a1db-4ead-914d-e2d11faa33b6/routeoverride-cni/0.log" Apr 23 16:54:21.902342 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:21.902318 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-58q6m_9696cbb9-a1db-4ead-914d-e2d11faa33b6/whereabouts-cni-bincopy/0.log" Apr 23 16:54:21.922774 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:21.922749 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-58q6m_9696cbb9-a1db-4ead-914d-e2d11faa33b6/whereabouts-cni/0.log" Apr 23 16:54:22.304096 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:22.304066 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-f889w_67b8cec4-f05e-4ef7-9456-915dfa5c7554/network-metrics-daemon/0.log" Apr 23 16:54:22.321596 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:22.321547 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-f889w_67b8cec4-f05e-4ef7-9456-915dfa5c7554/kube-rbac-proxy/0.log" Apr 23 16:54:23.733274 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:23.733240 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xvn7t_84c993c8-4dd2-40dc-b624-68a9f75a89cb/ovn-controller/0.log" Apr 23 16:54:23.749789 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:23.749760 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xvn7t_84c993c8-4dd2-40dc-b624-68a9f75a89cb/ovn-acl-logging/0.log" Apr 23 16:54:23.760198 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:23.760171 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xvn7t_84c993c8-4dd2-40dc-b624-68a9f75a89cb/ovn-acl-logging/1.log" Apr 23 16:54:23.779025 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:23.778997 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xvn7t_84c993c8-4dd2-40dc-b624-68a9f75a89cb/kube-rbac-proxy-node/0.log" Apr 23 16:54:23.799113 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:23.799083 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xvn7t_84c993c8-4dd2-40dc-b624-68a9f75a89cb/kube-rbac-proxy-ovn-metrics/0.log" Apr 23 16:54:23.819204 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:23.819165 2580 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xvn7t_84c993c8-4dd2-40dc-b624-68a9f75a89cb/northd/0.log" Apr 23 16:54:23.838800 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:23.838763 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xvn7t_84c993c8-4dd2-40dc-b624-68a9f75a89cb/nbdb/0.log" Apr 23 16:54:23.859059 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:23.858999 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xvn7t_84c993c8-4dd2-40dc-b624-68a9f75a89cb/sbdb/0.log" Apr 23 16:54:24.013927 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:24.013845 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xvn7t_84c993c8-4dd2-40dc-b624-68a9f75a89cb/ovnkube-controller/0.log" Apr 23 16:54:24.948993 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:24.948961 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-7bn2z_38d83fc0-30d4-48d7-8aee-f7afaa404c2e/network-check-target-container/0.log" Apr 23 16:54:25.850436 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:25.850408 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-qg8fj_58db2d9c-607b-4549-8e61-e385991f3a16/iptables-alerter/0.log" Apr 23 16:54:26.439552 ip-10-0-142-4 kubenswrapper[2580]: I0423 16:54:26.439520 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-kwzbs_f3054782-d344-49bd-865a-493a82cdebb1/tuned/0.log"