Apr 24 21:24:55.526554 ip-10-0-129-36 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 21:24:55.526569 ip-10-0-129-36 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 21:24:55.526580 ip-10-0-129-36 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 21:24:55.526910 ip-10-0-129-36 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 21:25:05.750590 ip-10-0-129-36 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 21:25:05.750610 ip-10-0-129-36 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 24e68279826348c2bf377cbd054dfc9c --
Apr 24 21:27:26.257572 ip-10-0-129-36 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:27:26.739317 ip-10-0-129-36 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:26.739317 ip-10-0-129-36 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:27:26.740114 ip-10-0-129-36 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:26.740114 ip-10-0-129-36 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:27:26.740114 ip-10-0-129-36 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:26.744631 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.744533 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:27:26.747959 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.747944 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:26.747994 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.747960 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:26.747994 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.747965 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:26.747994 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.747970 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:26.747994 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.747973 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:26.747994 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.747977 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:26.747994 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.747982 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:26.747994 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.747986 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:26.747994 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.747989 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:26.747994 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.747992 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:26.747994 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.747995 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:26.747994 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.747998 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:26.748313 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748001 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:26.748313 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748004 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:26.748313 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748007 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:26.748313 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748009 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:26.748313 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748013 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:26.748313 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748016 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:26.748313 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748020 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:26.748313 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748026 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:26.748313 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748029 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:26.748313 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748032 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:26.748313 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748034 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:26.748313 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748037 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:26.748313 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748040 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:26.748313 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748042 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:26.748313 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748045 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:26.748313 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748047 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:26.748313 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748050 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:26.748313 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748054 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:26.748313 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748056 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:26.748313 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748059 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:26.748783 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748061 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:26.748783 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748064 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:26.748783 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748066 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:26.748783 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748069 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:26.748783 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748072 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:26.748783 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748074 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:26.748783 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748078 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:26.748783 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748080 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:26.748783 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748083 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:26.748783 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748086 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:26.748783 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748090 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:26.748783 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748092 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:26.748783 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748095 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:26.748783 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748099 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:26.748783 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748102 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:26.748783 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748104 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:26.748783 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748107 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:26.748783 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748110 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:26.748783 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748112 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:26.748783 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748115 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:26.749288 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748117 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:26.749288 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748120 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:26.749288 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748123 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:26.749288 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748125 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:26.749288 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748129 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:26.749288 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748133 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:26.749288 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748136 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:26.749288 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748138 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:26.749288 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748141 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:26.749288 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748144 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:26.749288 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748147 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:26.749288 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748149 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:26.749288 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748152 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:26.749288 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748154 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:26.749288 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748157 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:26.749288 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748159 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:26.749288 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748162 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:26.749288 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748164 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:26.749288 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748167 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:26.749288 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748169 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:26.749771 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748173 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:26.749771 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748176 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:26.749771 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748178 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:26.749771 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748182 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:26.749771 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748185 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:26.749771 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748188 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:26.749771 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748191 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:26.749771 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748193 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:26.749771 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748196 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:26.749771 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748199 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:26.749771 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748201 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:26.749771 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748204 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:26.749771 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748206 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:26.749771 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748209 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:26.749771 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748644 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:26.749771 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748649 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:26.749771 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748653 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:26.749771 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748656 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:26.749771 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748658 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:26.749771 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748661 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:26.750260 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748664 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:26.750260 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748669 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:26.750260 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748672 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:26.750260 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748676 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:26.750260 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748679 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:26.750260 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748682 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:26.750260 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748685 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:26.750260 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748688 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:26.750260 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748690 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:26.750260 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748693 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:26.750260 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748696 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:26.750260 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748699 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:26.750260 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748701 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:26.750260 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748704 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:26.750260 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748707 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:26.750260 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748710 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:26.750260 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748713 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:26.750260 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748717 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:26.750260 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748720 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:26.750727 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748723 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:26.750727 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748725 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:26.750727 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748729 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:26.750727 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748731 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:26.750727 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748733 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:26.750727 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748736 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:26.750727 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748738 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:26.750727 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748741 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:26.750727 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748743 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:26.750727 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748745 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:26.750727 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748748 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:26.750727 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748750 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:26.750727 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748753 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:26.750727 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748755 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:26.750727 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748758 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:26.750727 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748760 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:26.750727 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748763 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:26.750727 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748765 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:26.750727 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748768 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:26.750727 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748770 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:26.751220 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748773 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:26.751220 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748775 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:26.751220 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748778 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:26.751220 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748780 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:26.751220 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748783 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:26.751220 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748785 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:26.751220 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748788 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:26.751220 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748791 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:26.751220 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748794 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:26.751220 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748797 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:26.751220 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748802 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:26.751220 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748804 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:26.751220 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748807 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:26.751220 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748810 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:26.751220 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748812 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:26.751220 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748815 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:26.751220 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748817 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:26.751220 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748820 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:26.751220 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748822 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:26.751701 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748825 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:26.751701 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748827 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:26.751701 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748830 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:26.751701 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748832 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:26.751701 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748835 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:26.751701 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748837 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:26.751701 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748841 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:26.751701 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748844 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:26.751701 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748848 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:26.751701 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748850 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:26.751701 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748853 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:26.751701 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748856 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:26.751701 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748858 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:26.751701 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748861 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:26.751701 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748863 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:26.751701 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748866 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:26.751701 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748868 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:26.751701 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748871 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:26.751701 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748874 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:26.751701 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748877 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:26.752211 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748879 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:26.752211 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.748882 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:26.752211 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749829 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:27:26.752211 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749846 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:27:26.752211 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749854 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:27:26.752211 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749858 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:27:26.752211 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749863 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:27:26.752211 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749867 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:27:26.752211 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749872 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:27:26.752211 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749877 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:27:26.752211 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749880 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:27:26.752211 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749883 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:27:26.752211 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749887 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:27:26.752211 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749891 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:27:26.752211 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749894 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:27:26.752211 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749897 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:27:26.752211 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749901 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:27:26.752211 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749904 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:27:26.752211 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749906 2577 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:27:26.752211 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749909 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:27:26.752211 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749912 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:27:26.752211 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749917 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:27:26.752211 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749919 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:27:26.752211 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749922 2577 flags.go:64] FLAG: --config-dir=""
Apr 24 21:27:26.752881 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749925 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:27:26.752881 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749929 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:27:26.752881 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749932 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:27:26.752881 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749935 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:27:26.752881 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749938 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:27:26.752881 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749941 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:27:26.752881 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749944 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:27:26.752881 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749948 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:27:26.752881 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749950 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:27:26.752881 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749954 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:27:26.752881 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749957 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:27:26.752881 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749961 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:27:26.752881 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749965 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:27:26.752881 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749968 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:27:26.752881 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749971 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:27:26.752881 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749974 2577 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:27:26.752881 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749977 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 21:27:26.752881 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749982 2577 flags.go:64] FLAG: --event-burst="100"
Apr 24 21:27:26.752881 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749986 2577 flags.go:64] FLAG: --event-qps="50"
Apr 24 21:27:26.752881 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749989 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 21:27:26.752881 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749992 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 21:27:26.752881 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749995 2577 flags.go:64] FLAG: --eviction-hard=""
Apr 24 21:27:26.752881 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.749999 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 21:27:26.752881 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750002 2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 21:27:26.753472 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750005 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 21:27:26.753472 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750009 2577 flags.go:64] FLAG: --eviction-soft=""
Apr 24 21:27:26.753472 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750011 2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 21:27:26.753472 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750014 2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 21:27:26.753472 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750017 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 21:27:26.753472 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750020 2577 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 21:27:26.753472 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750023 2577 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 21:27:26.753472 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750026 2577 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 21:27:26.753472 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750029 2577 flags.go:64] FLAG: --feature-gates=""
Apr 24 21:27:26.753472 ip-10-0-129-36 kubenswrapper[2577]: I0424 
21:27:26.750033 2577 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 21:27:26.753472 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750036 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 21:27:26.753472 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750039 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 21:27:26.753472 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750042 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 21:27:26.753472 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750045 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 24 21:27:26.753472 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750048 2577 flags.go:64] FLAG: --help="false" Apr 24 21:27:26.753472 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750051 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-129-36.ec2.internal" Apr 24 21:27:26.753472 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750054 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:27:26.753472 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750058 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 21:27:26.753472 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750061 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:27:26.753472 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750065 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:27:26.753472 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750068 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 21:27:26.753472 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750072 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 21:27:26.753472 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750075 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:27:26.754064 
ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750078 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:27:26.754064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750081 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:27:26.754064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750084 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:27:26.754064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750087 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:27:26.754064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750090 2577 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:27:26.754064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750093 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:27:26.754064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750096 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:27:26.754064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750099 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:27:26.754064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750101 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:27:26.754064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750104 2577 flags.go:64] FLAG: --lock-file="" Apr 24 21:27:26.754064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750107 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:27:26.754064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750110 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:27:26.754064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750113 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 21:27:26.754064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750119 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:27:26.754064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750122 2577 flags.go:64] FLAG: 
--log-text-info-buffer-size="0" Apr 24 21:27:26.754064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750124 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:27:26.754064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750127 2577 flags.go:64] FLAG: --logging-format="text" Apr 24 21:27:26.754064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750130 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:27:26.754064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750134 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:27:26.754064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750137 2577 flags.go:64] FLAG: --manifest-url="" Apr 24 21:27:26.754064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750139 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:27:26.754064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750144 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:27:26.754064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750147 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 21:27:26.754064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750151 2577 flags.go:64] FLAG: --max-pods="110" Apr 24 21:27:26.754064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750154 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:27:26.754692 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750157 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:27:26.754692 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750160 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:27:26.754692 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750164 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 21:27:26.754692 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750167 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:27:26.754692 ip-10-0-129-36 
kubenswrapper[2577]: I0424 21:27:26.750170 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:27:26.754692 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750173 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:27:26.754692 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750181 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 21:27:26.754692 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750185 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:27:26.754692 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750188 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 21:27:26.754692 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750191 2577 flags.go:64] FLAG: --pod-cidr="" Apr 24 21:27:26.754692 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750194 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:27:26.754692 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750200 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 21:27:26.754692 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750203 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 21:27:26.754692 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750206 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 24 21:27:26.754692 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750209 2577 flags.go:64] FLAG: --port="10250" Apr 24 21:27:26.754692 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750212 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:27:26.754692 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750215 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c48fc6b3fd677a3e" Apr 24 21:27:26.754692 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750218 2577 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:27:26.754692 
ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750221 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:27:26.754692 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750224 2577 flags.go:64] FLAG: --register-node="true" Apr 24 21:27:26.754692 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750228 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 24 21:27:26.754692 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750231 2577 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:27:26.754692 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750234 2577 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:27:26.754692 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750237 2577 flags.go:64] FLAG: --registry-qps="5" Apr 24 21:27:26.755318 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750241 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 24 21:27:26.755318 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750244 2577 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:27:26.755318 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750261 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 21:27:26.755318 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750264 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 21:27:26.755318 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750268 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 21:27:26.755318 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750271 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:27:26.755318 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750274 2577 flags.go:64] FLAG: --runonce="false" Apr 24 21:27:26.755318 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750277 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 21:27:26.755318 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750280 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:27:26.755318 
ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750283 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:27:26.755318 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750286 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:27:26.755318 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750290 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 21:27:26.755318 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750293 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:27:26.755318 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750296 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 21:27:26.755318 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750299 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 21:27:26.755318 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750302 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:27:26.755318 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750306 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 21:27:26.755318 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750309 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:27:26.755318 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750312 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 21:27:26.755318 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750315 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:27:26.755318 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750318 2577 flags.go:64] FLAG: --system-cgroups="" Apr 24 21:27:26.755318 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750321 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 21:27:26.755318 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750326 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 21:27:26.755318 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750329 
2577 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:27:26.755318 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750336 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 21:27:26.756053 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750341 2577 flags.go:64] FLAG: --tls-min-version="" Apr 24 21:27:26.756053 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750344 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 21:27:26.756053 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750347 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:27:26.756053 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750350 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:27:26.756053 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750353 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 21:27:26.756053 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750356 2577 flags.go:64] FLAG: --v="2" Apr 24 21:27:26.756053 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750360 2577 flags.go:64] FLAG: --version="false" Apr 24 21:27:26.756053 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750364 2577 flags.go:64] FLAG: --vmodule="" Apr 24 21:27:26.756053 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750372 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 21:27:26.756053 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750376 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:27:26.756053 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750492 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:27:26.756053 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750496 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:27:26.756053 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750500 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 
21:27:26.756053 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750503 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:27:26.756053 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750506 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:27:26.756053 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750509 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:27:26.756053 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750512 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:27:26.756053 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750515 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:27:26.756053 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750518 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:27:26.756053 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750521 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:27:26.756053 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750523 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:27:26.756053 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750527 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:27:26.756053 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750530 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:27:26.757024 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750533 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:27:26.757024 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750536 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:27:26.757024 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750561 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:27:26.757024 
ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750565 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:27:26.757024 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750569 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:27:26.757024 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750573 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:27:26.757024 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750576 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:27:26.757024 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750578 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:27:26.757024 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750583 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:27:26.757024 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750586 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:27:26.757024 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750589 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:27:26.757024 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750591 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:27:26.757024 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750594 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:27:26.757024 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750598 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 21:27:26.757024 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750602 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 21:27:26.757024 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750607 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:27:26.757024 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750610 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:27:26.757024 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750613 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:27:26.757024 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750616 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:27:26.757880 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750619 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:27:26.757880 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750622 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:27:26.757880 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750625 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:27:26.757880 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750628 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:27:26.757880 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750630 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:27:26.757880 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750633 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:27:26.757880 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750636 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:27:26.757880 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750639 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:27:26.757880 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750641 2577 feature_gate.go:328] 
unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:27:26.757880 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750644 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:27:26.757880 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750647 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:27:26.757880 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750650 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:27:26.757880 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750652 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:27:26.757880 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750655 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:27:26.757880 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750658 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:27:26.757880 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750660 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:27:26.757880 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750663 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:27:26.757880 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750666 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:27:26.757880 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750668 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:27:26.758396 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750671 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:27:26.758396 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750674 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 
21:27:26.758396 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750678 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:27:26.758396 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750680 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:27:26.758396 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750683 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:27:26.758396 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750685 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:27:26.758396 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750688 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:27:26.758396 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750690 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:27:26.758396 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750693 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:27:26.758396 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750696 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:27:26.758396 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750698 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:27:26.758396 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750700 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:27:26.758396 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750703 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:27:26.758396 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750705 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:27:26.758396 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750708 2577 feature_gate.go:328] unrecognized feature gate: 
EtcdBackendQuota Apr 24 21:27:26.758396 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750710 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:27:26.758396 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750713 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:27:26.758396 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750715 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:27:26.758396 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750718 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:27:26.758396 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750720 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:27:26.759057 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750724 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:27:26.759057 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750727 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:27:26.759057 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750730 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:27:26.759057 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750733 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:27:26.759057 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750736 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:27:26.759057 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750739 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:27:26.759057 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750741 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:27:26.759057 ip-10-0-129-36 kubenswrapper[2577]: W0424 
21:27:26.750744 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:26.759057 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750747 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:26.759057 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750750 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:26.759057 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750753 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:26.759057 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750756 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:26.759057 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750758 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:26.759057 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750761 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:26.759057 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.750764 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:26.759747 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.750769 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:27:26.759747 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.758735 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 21:27:26.759747 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.758760 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 21:27:26.759747 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758839 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:26.759747 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758848 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:26.759747 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758853 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:26.759747 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758858 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:26.759747 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758862 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:26.759747 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758866 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:26.759747 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758871 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:26.759747 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758876 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:26.759747 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758880 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:26.759747 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758884 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:26.759747 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758888 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:26.759747 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758893 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:26.759747 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758898 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:26.760317 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758905 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:26.760317 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758912 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:26.760317 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758917 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:26.760317 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758921 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:26.760317 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758925 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:26.760317 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758929 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:26.760317 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758934 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:26.760317 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758938 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:26.760317 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758943 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:26.760317 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758959 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:26.760317 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758963 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:26.760317 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758967 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:26.760317 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758972 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:26.760317 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758976 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:26.760317 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758981 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:26.760317 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758986 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:26.760317 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758990 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:26.760317 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758995 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:26.760317 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.758999 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:26.760860 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759005 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:26.760860 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759010 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:26.760860 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759014 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:26.760860 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759019 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:26.760860 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759024 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:26.760860 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759028 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:26.760860 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759033 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:26.760860 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759037 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:26.760860 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759041 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:26.760860 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759045 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:26.760860 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759049 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:26.760860 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759054 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:26.760860 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759058 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:26.760860 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759063 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:26.760860 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759069 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:26.760860 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759073 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:26.760860 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759083 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:26.760860 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759086 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:26.760860 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759090 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:26.761643 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759095 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:26.761643 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759099 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:26.761643 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759103 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:26.761643 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759107 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:26.761643 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759112 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:26.761643 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759117 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:26.761643 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759121 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:26.761643 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759125 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:26.761643 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759129 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:26.761643 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759133 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:26.761643 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759138 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:26.761643 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759142 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:26.761643 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759148 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:26.761643 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759152 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:26.761643 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759157 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:26.761643 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759161 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:26.761643 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759165 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:26.761643 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759169 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:26.761643 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759174 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:26.761643 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759179 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:26.762495 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759183 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:26.762495 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759187 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:26.762495 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759192 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:26.762495 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759196 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:26.762495 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759200 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:26.762495 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759204 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:26.762495 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759208 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:26.762495 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759212 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:26.762495 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759216 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:26.762495 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759220 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:26.762495 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759224 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:26.762495 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759228 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:26.762495 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759233 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:26.762495 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759238 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:26.762495 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759242 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:26.762856 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.759264 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:27:26.762856 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759428 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:26.762856 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759436 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:26.762856 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759440 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:26.762856 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759445 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:26.762856 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759449 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:26.762856 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759453 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:26.762856 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759458 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:26.762856 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759462 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:26.762856 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759468 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:26.762856 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759472 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:26.762856 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759478 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:26.762856 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759484 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:26.762856 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759488 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:26.762856 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759492 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:26.763392 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759497 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:26.763392 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759501 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:26.763392 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759505 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:26.763392 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759509 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:26.763392 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759513 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:26.763392 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759518 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:26.763392 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759521 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:26.763392 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759526 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:26.763392 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759530 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:26.763392 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759534 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:26.763392 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759538 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:26.763392 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759542 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:26.763392 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759546 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:26.763392 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759550 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:26.763392 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759555 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:26.763392 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759559 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:26.763392 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759563 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:26.763392 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759567 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:26.763392 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759571 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:26.763392 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759576 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:26.763890 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759581 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:26.763890 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759585 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:26.763890 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759589 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:26.763890 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759593 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:26.763890 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759597 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:26.763890 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759601 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:26.763890 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759606 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:26.763890 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759611 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:26.763890 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759615 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:26.763890 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759619 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:26.763890 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759624 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:26.763890 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759628 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:26.763890 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759633 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:26.763890 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759637 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:26.763890 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759641 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:26.763890 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759646 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:26.763890 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759650 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:26.763890 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759654 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:26.763890 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759658 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:26.763890 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759662 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:26.764491 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759666 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:26.764491 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759670 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:26.764491 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759674 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:26.764491 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759679 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:26.764491 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759683 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:26.764491 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759687 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:26.764491 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759691 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:26.764491 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759695 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:26.764491 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759699 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:26.764491 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759703 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:26.764491 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759707 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:26.764491 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759711 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:26.764491 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759715 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:26.764491 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759720 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:26.764491 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759725 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:26.764491 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759729 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:26.764491 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759733 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:26.764491 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759740 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:26.764491 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759746 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:26.764956 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759750 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:26.764956 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759755 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:26.764956 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759761 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:26.764956 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759766 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:26.764956 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759770 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:26.764956 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759774 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:26.764956 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759779 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:26.764956 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759784 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:26.764956 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759788 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:26.764956 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759792 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:26.764956 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759796 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:26.764956 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759800 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:26.764956 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:26.759805 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:26.764956 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.759813 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:27:26.764956 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.760487 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 21:27:26.765448 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.763004 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 21:27:26.765448 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.764185 2577 server.go:1019] "Starting client certificate rotation"
Apr 24 21:27:26.765448 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.764290 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:27:26.765448 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.765169 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:27:26.793672 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.793635 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:27:26.797177 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.797152 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:27:26.814587 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.814552 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 24 21:27:26.821299 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.821280 2577 log.go:25] "Validated CRI v1 image API"
Apr 24 21:27:26.822053 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.822030 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:27:26.822673 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.822658 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 21:27:26.826069 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.826049 2577 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 b3ea5229-808d-403c-b8fc-de2458cc5cd3:/dev/nvme0n1p3 c1a46019-e802-4ac4-a04b-5e15e8e14161:/dev/nvme0n1p4]
Apr 24 21:27:26.826135 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.826068 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 21:27:26.831862 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.831746 2577 manager.go:217] Machine: {Timestamp:2026-04-24 21:27:26.829906495 +0000 UTC m=+0.444176768 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3112505 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2f7fef8fa4fa791e617dc617fda63a SystemUUID:ec2f7fef-8fa4-fa79-1e61-7dc617fda63a BootID:24e68279-8263-48c2-bf37-7cbd054dfc9c Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:58:aa:0d:e6:07 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:58:aa:0d:e6:07 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:92:10:72:cf:d0:ff Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 21:27:26.831862 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.831855 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 21:27:26.831983 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.831939 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 21:27:26.832981 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.832948 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 21:27:26.833129 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.832981 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-36.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 21:27:26.833171 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.833137 2577 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 21:27:26.833171 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.833150 2577 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 21:27:26.833171 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.833167 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 21:27:26.834263 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.834238 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 21:27:26.835541 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.835530 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 21:27:26.835834 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.835825 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 24 21:27:26.838359 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.838347 2577 kubelet.go:491] "Attempting to sync node with API server"
Apr 24 21:27:26.838401 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.838368 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 21:27:26.838401 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.838380 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 24 21:27:26.838487 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.838402 2577 kubelet.go:397] "Adding apiserver pod source"
Apr 24 21:27:26.838487 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.838415 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 21:27:26.839554 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.839538 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 21:27:26.839554 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.839557 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 21:27:26.842589 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.842572 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 24 21:27:26.844351 ip-10-0-129-36
kubenswrapper[2577]: I0424 21:27:26.844338 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 21:27:26.846353 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.846336 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 21:27:26.846353 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.846354 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 21:27:26.846459 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.846360 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 21:27:26.846459 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.846366 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 21:27:26.846459 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.846372 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 21:27:26.846459 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.846377 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 21:27:26.846459 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.846390 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 21:27:26.846459 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.846399 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 21:27:26.846459 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.846406 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 21:27:26.846459 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.846412 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 21:27:26.846459 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.846426 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 21:27:26.846459 
ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.846435 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 21:27:26.847902 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.847893 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 21:27:26.847902 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.847902 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 21:27:26.851374 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.851361 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 21:27:26.851423 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.851399 2577 server.go:1295] "Started kubelet" Apr 24 21:27:26.851575 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.851524 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 21:27:26.851630 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.851596 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 21:27:26.851630 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.851586 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 21:27:26.852195 ip-10-0-129-36 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 21:27:26.853236 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.851623 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-36.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:27:26.853536 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:26.853499 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-36.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 21:27:26.853661 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:26.853501 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 21:27:26.855920 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.855901 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 24 21:27:26.856192 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.856176 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 21:27:26.861163 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.860993 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 21:27:26.861520 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.861499 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 21:27:26.862488 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:26.862451 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-36.ec2.internal\" not 
found" Apr 24 21:27:26.862488 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:26.861077 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-36.ec2.internal.18a9682780fbd8e6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-36.ec2.internal,UID:ip-10-0-129-36.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-36.ec2.internal,},FirstTimestamp:2026-04-24 21:27:26.851373286 +0000 UTC m=+0.465643556,LastTimestamp:2026-04-24 21:27:26.851373286 +0000 UTC m=+0.465643556,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-36.ec2.internal,}" Apr 24 21:27:26.862655 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.862507 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 21:27:26.862655 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.862525 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 21:27:26.862655 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:26.862579 2577 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 21:27:26.862655 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.862623 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 21:27:26.862832 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.862666 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 24 21:27:26.862832 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.862675 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 24 21:27:26.862832 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.862677 2577 factory.go:55] Registering systemd factory Apr 24 21:27:26.862832 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.862738 2577 factory.go:223] Registration of the systemd container factory successfully Apr 24 21:27:26.863011 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.862997 2577 factory.go:153] Registering CRI-O factory Apr 24 21:27:26.863059 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.863011 2577 factory.go:223] Registration of the crio container factory successfully Apr 24 21:27:26.863109 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.863069 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 21:27:26.863109 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.863091 2577 factory.go:103] Registering Raw factory Apr 24 21:27:26.863109 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.863108 2577 manager.go:1196] Started watching for new ooms in manager Apr 24 21:27:26.863586 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.863568 2577 manager.go:319] Starting recovery of all containers Apr 24 21:27:26.867854 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.867830 2577 csr.go:274] "Certificate signing request 
is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-whs2f" Apr 24 21:27:26.872548 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:26.872394 2577 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-129-36.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 21:27:26.872632 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:26.872409 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 21:27:26.873834 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.873816 2577 manager.go:324] Recovery completed Apr 24 21:27:26.877907 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.877891 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-whs2f" Apr 24 21:27:26.877970 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.877942 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:26.880284 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.880269 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-36.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:26.880342 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.880299 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-36.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:26.880342 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.880311 2577 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-129-36.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:26.880859 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.880846 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 21:27:26.880859 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.880857 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 21:27:26.880941 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.880872 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:27:26.882739 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.882726 2577 policy_none.go:49] "None policy: Start" Apr 24 21:27:26.882780 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.882742 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 21:27:26.882780 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.882752 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 24 21:27:26.926833 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.926812 2577 manager.go:341] "Starting Device Plugin manager" Apr 24 21:27:26.935219 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:26.926844 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 21:27:26.935219 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.926855 2577 server.go:85] "Starting device plugin registration server" Apr 24 21:27:26.935219 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.927105 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 21:27:26.935219 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.927117 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 21:27:26.935219 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.927205 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 21:27:26.935219 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.927334 
2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 21:27:26.935219 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.927344 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 21:27:26.935219 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:26.929818 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 21:27:26.935219 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:26.929853 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-36.ec2.internal\" not found" Apr 24 21:27:26.965083 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.965054 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 21:27:26.966180 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.966167 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 21:27:26.966241 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.966191 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 21:27:26.966241 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.966210 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 21:27:26.966241 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.966218 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 21:27:26.966410 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:26.966268 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 21:27:26.969752 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:26.969728 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:27.028107 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.028037 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:27.029217 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.029200 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-36.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:27.029293 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.029234 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-36.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:27.029293 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.029266 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-36.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:27.029360 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.029297 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-36.ec2.internal" Apr 24 21:27:27.039150 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.039130 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-36.ec2.internal" Apr 24 21:27:27.039203 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:27.039156 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-36.ec2.internal\": node \"ip-10-0-129-36.ec2.internal\" not found" Apr 24 21:27:27.057219 
ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:27.057192 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-36.ec2.internal\" not found" Apr 24 21:27:27.067313 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.067284 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-36.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-36.ec2.internal"] Apr 24 21:27:27.067366 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.067352 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:27.068176 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.068160 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-36.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:27.068274 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.068193 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-36.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:27.068274 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.068206 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-36.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:27.069293 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.069281 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:27.069438 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.069426 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-36.ec2.internal" Apr 24 21:27:27.069473 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.069454 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:27.069972 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.069957 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-36.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:27.070031 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.069972 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-36.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:27.070031 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.069987 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-36.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:27.070031 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.070002 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-36.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:27.070135 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.069991 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-36.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:27.070181 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.070141 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-36.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:27.071164 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.071148 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-36.ec2.internal" Apr 24 21:27:27.071236 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.071176 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:27.071858 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.071842 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-36.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:27.071948 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.071866 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-36.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:27.071948 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.071879 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-36.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:27.095977 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:27.095954 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-36.ec2.internal\" not found" node="ip-10-0-129-36.ec2.internal" Apr 24 21:27:27.100333 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:27.100316 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-36.ec2.internal\" not found" node="ip-10-0-129-36.ec2.internal" Apr 24 21:27:27.158090 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:27.158056 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-36.ec2.internal\" not found" Apr 24 21:27:27.164773 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.164750 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5803b248a78a545dbea4248274c87b99-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-129-36.ec2.internal\" (UID: \"5803b248a78a545dbea4248274c87b99\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-36.ec2.internal" Apr 24 21:27:27.164879 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.164781 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5803b248a78a545dbea4248274c87b99-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-36.ec2.internal\" (UID: \"5803b248a78a545dbea4248274c87b99\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-36.ec2.internal" Apr 24 21:27:27.164879 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.164800 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46c159a6e67f71d6ddbcaa845877ef38-config\") pod \"kube-apiserver-proxy-ip-10-0-129-36.ec2.internal\" (UID: \"46c159a6e67f71d6ddbcaa845877ef38\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-36.ec2.internal" Apr 24 21:27:27.258385 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:27.258330 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-36.ec2.internal\" not found" Apr 24 21:27:27.265794 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.265767 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5803b248a78a545dbea4248274c87b99-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-36.ec2.internal\" (UID: \"5803b248a78a545dbea4248274c87b99\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-36.ec2.internal" Apr 24 21:27:27.265910 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.265812 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/5803b248a78a545dbea4248274c87b99-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-36.ec2.internal\" (UID: \"5803b248a78a545dbea4248274c87b99\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-36.ec2.internal" Apr 24 21:27:27.265910 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.265837 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46c159a6e67f71d6ddbcaa845877ef38-config\") pod \"kube-apiserver-proxy-ip-10-0-129-36.ec2.internal\" (UID: \"46c159a6e67f71d6ddbcaa845877ef38\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-36.ec2.internal" Apr 24 21:27:27.265910 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.265859 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5803b248a78a545dbea4248274c87b99-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-36.ec2.internal\" (UID: \"5803b248a78a545dbea4248274c87b99\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-36.ec2.internal" Apr 24 21:27:27.265910 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.265883 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46c159a6e67f71d6ddbcaa845877ef38-config\") pod \"kube-apiserver-proxy-ip-10-0-129-36.ec2.internal\" (UID: \"46c159a6e67f71d6ddbcaa845877ef38\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-36.ec2.internal" Apr 24 21:27:27.266090 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.265925 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5803b248a78a545dbea4248274c87b99-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-36.ec2.internal\" (UID: \"5803b248a78a545dbea4248274c87b99\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-36.ec2.internal" Apr 24 21:27:27.359212 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:27.359177 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-36.ec2.internal\" not found" Apr 24 21:27:27.398780 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.398752 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-36.ec2.internal" Apr 24 21:27:27.403299 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.403277 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-36.ec2.internal" Apr 24 21:27:27.460015 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:27.459966 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-36.ec2.internal\" not found" Apr 24 21:27:27.560498 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:27.560453 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-36.ec2.internal\" not found" Apr 24 21:27:27.661115 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:27.661049 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-36.ec2.internal\" not found" Apr 24 21:27:27.734261 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.734216 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:27.761548 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:27.761516 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-36.ec2.internal\" not found" Apr 24 21:27:27.763685 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.763670 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using 
new credentials" Apr 24 21:27:27.763834 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.763815 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 21:27:27.763870 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.763842 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 21:27:27.861317 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.861287 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 21:27:27.861737 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:27.861719 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-36.ec2.internal\" not found" Apr 24 21:27:27.875155 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.875128 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 21:27:27.880600 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.880564 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:22:26 +0000 UTC" deadline="2027-11-27 21:34:56.539110027 +0000 UTC" Apr 24 21:27:27.880600 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.880591 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13968h7m28.65852174s" Apr 24 21:27:27.881370 ip-10-0-129-36 kubenswrapper[2577]: W0424 
21:27:27.881338 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5803b248a78a545dbea4248274c87b99.slice/crio-bbcf98368708cf568116f541c50b9b54301121dc18b1ee247a0cc77a1381bffd WatchSource:0}: Error finding container bbcf98368708cf568116f541c50b9b54301121dc18b1ee247a0cc77a1381bffd: Status 404 returned error can't find the container with id bbcf98368708cf568116f541c50b9b54301121dc18b1ee247a0cc77a1381bffd Apr 24 21:27:27.881795 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:27.881774 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46c159a6e67f71d6ddbcaa845877ef38.slice/crio-b0036a64393e2af52d56d96d6d8357aa509ac25c9792cb600f50a90947062c69 WatchSource:0}: Error finding container b0036a64393e2af52d56d96d6d8357aa509ac25c9792cb600f50a90947062c69: Status 404 returned error can't find the container with id b0036a64393e2af52d56d96d6d8357aa509ac25c9792cb600f50a90947062c69 Apr 24 21:27:27.885661 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.885647 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:27:27.897870 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.897854 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-qqc9h" Apr 24 21:27:27.905534 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.905516 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-qqc9h" Apr 24 21:27:27.961989 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:27.961915 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-36.ec2.internal\" not found" Apr 24 21:27:27.969111 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.969071 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-129-36.ec2.internal" event={"ID":"46c159a6e67f71d6ddbcaa845877ef38","Type":"ContainerStarted","Data":"b0036a64393e2af52d56d96d6d8357aa509ac25c9792cb600f50a90947062c69"} Apr 24 21:27:27.970094 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:27.970072 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-36.ec2.internal" event={"ID":"5803b248a78a545dbea4248274c87b99","Type":"ContainerStarted","Data":"bbcf98368708cf568116f541c50b9b54301121dc18b1ee247a0cc77a1381bffd"} Apr 24 21:27:28.012346 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.012320 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:28.062257 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:28.062214 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-36.ec2.internal\" not found" Apr 24 21:27:28.162790 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:28.162756 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-36.ec2.internal\" not found" Apr 24 21:27:28.263239 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:28.263157 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-36.ec2.internal\" not found" Apr 24 21:27:28.281901 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.281868 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:28.363238 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.363205 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-36.ec2.internal" Apr 24 21:27:28.378586 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.378556 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's 
hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:27:28.379587 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.379566 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-36.ec2.internal" Apr 24 21:27:28.385624 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.385591 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:27:28.840219 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.840170 2577 apiserver.go:52] "Watching apiserver" Apr 24 21:27:28.850395 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.850361 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 21:27:28.850810 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.850778 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-129-36.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7","openshift-multus/multus-9vwgn","openshift-network-diagnostics/network-check-target-cnjcx","openshift-network-operator/iptables-alerter-xtssb","kube-system/konnectivity-agent-g5vd4","openshift-cluster-node-tuning-operator/tuned-4bxgg","openshift-dns/node-resolver-5fmr6","openshift-image-registry/node-ca-qx6tc","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-36.ec2.internal","openshift-multus/multus-additional-cni-plugins-gcwlm","openshift-multus/network-metrics-daemon-jcztz","openshift-ovn-kubernetes/ovnkube-node-csxqq","kube-system/global-pull-secret-syncer-pvftk"] Apr 24 21:27:28.852127 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.852098 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4bxgg" Apr 24 21:27:28.853362 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.853339 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" Apr 24 21:27:28.854395 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.854372 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9vwgn" Apr 24 21:27:28.854647 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.854625 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:27:28.854744 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.854647 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-z62gp\"" Apr 24 21:27:28.855064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.855048 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 21:27:28.855620 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.855570 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cnjcx" Apr 24 21:27:28.855725 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.855676 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 21:27:28.855783 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:28.855716 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cnjcx" podUID="25d36037-41e8-4ba4-9072-939c3c9e4e19" Apr 24 21:27:28.856315 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.856298 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 21:27:28.856846 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.856670 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 21:27:28.857060 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.857042 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 21:27:28.857417 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.857397 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 21:27:28.857527 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.857438 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-q8d98\"" Apr 24 21:27:28.857648 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.857632 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xtssb" Apr 24 21:27:28.857745 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.857728 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-g5vd4" Apr 24 21:27:28.857938 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.857922 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 21:27:28.858433 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.858414 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 21:27:28.858521 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.858440 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 21:27:28.858521 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.858470 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-pdptk\"" Apr 24 21:27:28.858632 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.858545 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 21:27:28.858802 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.858768 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 21:27:28.858899 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.858867 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qx6tc" Apr 24 21:27:28.859695 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.859669 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:27:28.860373 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.860044 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-5fmr6" Apr 24 21:27:28.861242 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.861222 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 21:27:28.861369 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.861314 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-bwdgz\"" Apr 24 21:27:28.862079 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.861714 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 21:27:28.862079 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.861723 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 21:27:28.862079 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.861742 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 21:27:28.862079 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.861769 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 21:27:28.862079 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.861770 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 21:27:28.862079 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.861825 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 21:27:28.862079 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.861937 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-z47zn\"" Apr 24 
21:27:28.862502 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.862197 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 21:27:28.862589 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.862571 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7" Apr 24 21:27:28.862685 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.862632 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcztz" Apr 24 21:27:28.862748 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:28.862724 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jcztz" podUID="b0b45556-212a-460b-a5ae-108beeb6197d" Apr 24 21:27:28.862802 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.862784 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-285zj\"" Apr 24 21:27:28.862970 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.862954 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 21:27:28.863338 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.863314 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-2vjjw\"" Apr 24 21:27:28.864707 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.864686 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gcwlm" Apr 24 21:27:28.865119 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.865005 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 21:27:28.865355 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.865333 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 21:27:28.865441 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.865399 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 21:27:28.865519 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.865500 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-wv2wf\"" Apr 24 21:27:28.866338 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.866321 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pvftk" Apr 24 21:27:28.866428 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:28.866407 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-pvftk" podUID="20dab956-3a56-4a27-bcee-9f75822a7970" Apr 24 21:27:28.867561 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.867512 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 21:27:28.868527 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.868344 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 21:27:28.868527 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.868444 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-f6n7r\"" Apr 24 21:27:28.873228 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.873210 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbx86\" (UniqueName: \"kubernetes.io/projected/54bfba7f-bc92-446e-9646-877d96783afd-kube-api-access-zbx86\") pod \"aws-ebs-csi-driver-node-t8mq7\" (UID: \"54bfba7f-bc92-446e-9646-877d96783afd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7" Apr 24 21:27:28.873325 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.873238 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" Apr 24 21:27:28.873387 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.873321 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-ovnkube-script-lib\") pod 
\"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" Apr 24 21:27:28.873387 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.873356 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gcwlm\" (UID: \"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c\") " pod="openshift-multus/multus-additional-cni-plugins-gcwlm" Apr 24 21:27:28.873387 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.873382 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-multus-socket-dir-parent\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn" Apr 24 21:27:28.873517 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.873406 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-etc-kubernetes\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn" Apr 24 21:27:28.873517 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.873423 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsknp\" (UniqueName: \"kubernetes.io/projected/b0b45556-212a-460b-a5ae-108beeb6197d-kube-api-access-qsknp\") pod \"network-metrics-daemon-jcztz\" (UID: \"b0b45556-212a-460b-a5ae-108beeb6197d\") " pod="openshift-multus/network-metrics-daemon-jcztz" Apr 24 21:27:28.873517 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.873438 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c-cni-binary-copy\") pod \"multus-additional-cni-plugins-gcwlm\" (UID: \"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c\") " pod="openshift-multus/multus-additional-cni-plugins-gcwlm" Apr 24 21:27:28.873517 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.873457 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-ovn-node-metrics-cert\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" Apr 24 21:27:28.873517 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.873480 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-sys\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg" Apr 24 21:27:28.873517 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.873499 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-system-cni-dir\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn" Apr 24 21:27:28.873766 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.873527 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/28a670ce-fdb2-4872-af68-5a9ab19b64cc-cni-binary-copy\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn" Apr 24 21:27:28.873766 
ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.873548 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-multus-conf-dir\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn" Apr 24 21:27:28.873766 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.873568 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gcwlm\" (UID: \"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c\") " pod="openshift-multus/multus-additional-cni-plugins-gcwlm" Apr 24 21:27:28.873766 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.873604 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-run-systemd\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" Apr 24 21:27:28.873766 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.873637 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/154e8a35-de7d-4d32-a077-f455b275faf2-hosts-file\") pod \"node-resolver-5fmr6\" (UID: \"154e8a35-de7d-4d32-a077-f455b275faf2\") " pod="openshift-dns/node-resolver-5fmr6" Apr 24 21:27:28.873766 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.873668 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv578\" (UniqueName: \"kubernetes.io/projected/02092214-a2c7-40c0-8e80-688f20002a35-kube-api-access-pv578\") pod 
\"iptables-alerter-xtssb\" (UID: \"02092214-a2c7-40c0-8e80-688f20002a35\") " pod="openshift-network-operator/iptables-alerter-xtssb" Apr 24 21:27:28.873766 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.873704 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54bfba7f-bc92-446e-9646-877d96783afd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-t8mq7\" (UID: \"54bfba7f-bc92-446e-9646-877d96783afd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7" Apr 24 21:27:28.873766 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.873727 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/54bfba7f-bc92-446e-9646-877d96783afd-sys-fs\") pod \"aws-ebs-csi-driver-node-t8mq7\" (UID: \"54bfba7f-bc92-446e-9646-877d96783afd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7" Apr 24 21:27:28.873766 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.873743 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-host-cni-bin\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" Apr 24 21:27:28.873766 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.873759 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/67dbaaba-431e-4e09-9019-650f32d8999d-agent-certs\") pod \"konnectivity-agent-g5vd4\" (UID: \"67dbaaba-431e-4e09-9019-650f32d8999d\") " pod="kube-system/konnectivity-agent-g5vd4" Apr 24 21:27:28.874124 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.873781 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-etc-sysconfig\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg" Apr 24 21:27:28.874124 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.873838 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/82fe3a05-41ab-423c-aab1-343f07ea6c35-tmp\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg" Apr 24 21:27:28.874124 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.873864 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-multus-cni-dir\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn" Apr 24 21:27:28.874124 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.873879 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-host-var-lib-cni-bin\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn" Apr 24 21:27:28.874124 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.873919 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/02092214-a2c7-40c0-8e80-688f20002a35-iptables-alerter-script\") pod \"iptables-alerter-xtssb\" (UID: \"02092214-a2c7-40c0-8e80-688f20002a35\") " pod="openshift-network-operator/iptables-alerter-xtssb" Apr 24 21:27:28.874124 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.873954 
2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/54bfba7f-bc92-446e-9646-877d96783afd-registration-dir\") pod \"aws-ebs-csi-driver-node-t8mq7\" (UID: \"54bfba7f-bc92-446e-9646-877d96783afd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7"
Apr 24 21:27:28.874124 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.873982 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-host-run-netns\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.874124 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874019 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-log-socket\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.874124 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874038 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/06abc9ab-6358-4dae-add4-0d288195411f-host\") pod \"node-ca-qx6tc\" (UID: \"06abc9ab-6358-4dae-add4-0d288195411f\") " pod="openshift-image-registry/node-ca-qx6tc"
Apr 24 21:27:28.874124 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874058 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/154e8a35-de7d-4d32-a077-f455b275faf2-tmp-dir\") pod \"node-resolver-5fmr6\" (UID: \"154e8a35-de7d-4d32-a077-f455b275faf2\") " pod="openshift-dns/node-resolver-5fmr6"
Apr 24 21:27:28.874124 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874090 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwm45\" (UniqueName: \"kubernetes.io/projected/154e8a35-de7d-4d32-a077-f455b275faf2-kube-api-access-gwm45\") pod \"node-resolver-5fmr6\" (UID: \"154e8a35-de7d-4d32-a077-f455b275faf2\") " pod="openshift-dns/node-resolver-5fmr6"
Apr 24 21:27:28.874124 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874117 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-host-run-k8s-cni-cncf-io\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.874614 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874140 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/54bfba7f-bc92-446e-9646-877d96783afd-socket-dir\") pod \"aws-ebs-csi-driver-node-t8mq7\" (UID: \"54bfba7f-bc92-446e-9646-877d96783afd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7"
Apr 24 21:27:28.874614 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874166 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/54bfba7f-bc92-446e-9646-877d96783afd-etc-selinux\") pod \"aws-ebs-csi-driver-node-t8mq7\" (UID: \"54bfba7f-bc92-446e-9646-877d96783afd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7"
Apr 24 21:27:28.874614 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874211 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c-os-release\") pod \"multus-additional-cni-plugins-gcwlm\" (UID: \"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c\") " pod="openshift-multus/multus-additional-cni-plugins-gcwlm"
Apr 24 21:27:28.874614 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874233 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-node-log\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.874614 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874280 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/67dbaaba-431e-4e09-9019-650f32d8999d-konnectivity-ca\") pod \"konnectivity-agent-g5vd4\" (UID: \"67dbaaba-431e-4e09-9019-650f32d8999d\") " pod="kube-system/konnectivity-agent-g5vd4"
Apr 24 21:27:28.874614 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874304 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-run\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg"
Apr 24 21:27:28.874614 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874327 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjxt4\" (UniqueName: \"kubernetes.io/projected/25d36037-41e8-4ba4-9072-939c3c9e4e19-kube-api-access-cjxt4\") pod \"network-check-target-cnjcx\" (UID: \"25d36037-41e8-4ba4-9072-939c3c9e4e19\") " pod="openshift-network-diagnostics/network-check-target-cnjcx"
Apr 24 21:27:28.874614 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874363 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-host-slash\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.874614 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874408 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-host\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg"
Apr 24 21:27:28.874614 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874438 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-etc-openvswitch\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.874614 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874466 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/82fe3a05-41ab-423c-aab1-343f07ea6c35-etc-tuned\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg"
Apr 24 21:27:28.874614 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874495 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b45dj\" (UniqueName: \"kubernetes.io/projected/82fe3a05-41ab-423c-aab1-343f07ea6c35-kube-api-access-b45dj\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg"
Apr 24 21:27:28.874614 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874541 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gcwlm\" (UID: \"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c\") " pod="openshift-multus/multus-additional-cni-plugins-gcwlm"
Apr 24 21:27:28.874614 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874574 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-cnibin\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.874614 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874601 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-host-var-lib-kubelet\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.875319 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874629 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/28a670ce-fdb2-4872-af68-5a9ab19b64cc-multus-daemon-config\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.875319 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874653 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c-cnibin\") pod \"multus-additional-cni-plugins-gcwlm\" (UID: \"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c\") " pod="openshift-multus/multus-additional-cni-plugins-gcwlm"
Apr 24 21:27:28.875319 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874676 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-env-overrides\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.875319 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874726 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-etc-sysctl-d\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg"
Apr 24 21:27:28.875319 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874752 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-etc-sysctl-conf\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg"
Apr 24 21:27:28.875319 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874803 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02092214-a2c7-40c0-8e80-688f20002a35-host-slash\") pod \"iptables-alerter-xtssb\" (UID: \"02092214-a2c7-40c0-8e80-688f20002a35\") " pod="openshift-network-operator/iptables-alerter-xtssb"
Apr 24 21:27:28.875319 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874842 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-lib-modules\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg"
Apr 24 21:27:28.875319 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874889 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-host-run-multus-certs\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.875319 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874959 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-var-lib-openvswitch\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.875319 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.874989 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/54bfba7f-bc92-446e-9646-877d96783afd-device-dir\") pod \"aws-ebs-csi-driver-node-t8mq7\" (UID: \"54bfba7f-bc92-446e-9646-877d96783afd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7"
Apr 24 21:27:28.875319 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.875027 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-systemd-units\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.875319 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.875051 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-ovnkube-config\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.875319 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.875075 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s965\" (UniqueName: \"kubernetes.io/projected/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-kube-api-access-9s965\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.875319 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.875100 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-os-release\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.875319 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.875128 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgbdz\" (UniqueName: \"kubernetes.io/projected/28a670ce-fdb2-4872-af68-5a9ab19b64cc-kube-api-access-xgbdz\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.875319 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.875150 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs\") pod \"network-metrics-daemon-jcztz\" (UID: \"b0b45556-212a-460b-a5ae-108beeb6197d\") " pod="openshift-multus/network-metrics-daemon-jcztz"
Apr 24 21:27:28.875977 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.875174 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c-system-cni-dir\") pod \"multus-additional-cni-plugins-gcwlm\" (UID: \"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c\") " pod="openshift-multus/multus-additional-cni-plugins-gcwlm"
Apr 24 21:27:28.875977 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.875196 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-host-kubelet\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.875977 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.875220 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-run-openvswitch\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.875977 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.875243 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-run-ovn\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.875977 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.875282 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkvrp\" (UniqueName: \"kubernetes.io/projected/06abc9ab-6358-4dae-add4-0d288195411f-kube-api-access-lkvrp\") pod \"node-ca-qx6tc\" (UID: \"06abc9ab-6358-4dae-add4-0d288195411f\") " pod="openshift-image-registry/node-ca-qx6tc"
Apr 24 21:27:28.875977 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.875306 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-host-var-lib-cni-multus\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.875977 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.875328 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n6cm\" (UniqueName: \"kubernetes.io/projected/8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c-kube-api-access-6n6cm\") pod \"multus-additional-cni-plugins-gcwlm\" (UID: \"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c\") " pod="openshift-multus/multus-additional-cni-plugins-gcwlm"
Apr 24 21:27:28.875977 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.875352 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-host-run-netns\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.875977 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.875378 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-etc-modprobe-d\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg"
Apr 24 21:27:28.875977 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.875402 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-etc-kubernetes\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg"
Apr 24 21:27:28.875977 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.875424 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-etc-systemd\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg"
Apr 24 21:27:28.875977 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.875462 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-var-lib-kubelet\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg"
Apr 24 21:27:28.875977 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.875515 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-hostroot\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.875977 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.875549 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-host-run-ovn-kubernetes\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.875977 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.875576 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-host-cni-netd\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.875977 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.875608 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/06abc9ab-6358-4dae-add4-0d288195411f-serviceca\") pod \"node-ca-qx6tc\" (UID: \"06abc9ab-6358-4dae-add4-0d288195411f\") " pod="openshift-image-registry/node-ca-qx6tc"
Apr 24 21:27:28.906313 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.906279 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:22:27 +0000 UTC" deadline="2027-11-05 16:31:22.484208654 +0000 UTC"
Apr 24 21:27:28.906313 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.906311 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13435h3m53.577900338s"
Apr 24 21:27:28.963832 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.963803 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 24 21:27:28.976724 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.976693 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-system-cni-dir\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.976873 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.976737 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/28a670ce-fdb2-4872-af68-5a9ab19b64cc-cni-binary-copy\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.976873 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.976753 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-multus-conf-dir\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.976873 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.976775 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gcwlm\" (UID: \"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c\") " pod="openshift-multus/multus-additional-cni-plugins-gcwlm"
Apr 24 21:27:28.976873 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.976791 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-run-systemd\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.976873 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.976808 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/154e8a35-de7d-4d32-a077-f455b275faf2-hosts-file\") pod \"node-resolver-5fmr6\" (UID: \"154e8a35-de7d-4d32-a077-f455b275faf2\") " pod="openshift-dns/node-resolver-5fmr6"
Apr 24 21:27:28.976873 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.976806 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-system-cni-dir\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.976873 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.976827 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pv578\" (UniqueName: \"kubernetes.io/projected/02092214-a2c7-40c0-8e80-688f20002a35-kube-api-access-pv578\") pod \"iptables-alerter-xtssb\" (UID: \"02092214-a2c7-40c0-8e80-688f20002a35\") " pod="openshift-network-operator/iptables-alerter-xtssb"
Apr 24 21:27:28.976873 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.976876 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-run-systemd\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.977311 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.976885 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-multus-conf-dir\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.977311 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.976890 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54bfba7f-bc92-446e-9646-877d96783afd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-t8mq7\" (UID: \"54bfba7f-bc92-446e-9646-877d96783afd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7"
Apr 24 21:27:28.977311 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.976921 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/54bfba7f-bc92-446e-9646-877d96783afd-sys-fs\") pod \"aws-ebs-csi-driver-node-t8mq7\" (UID: \"54bfba7f-bc92-446e-9646-877d96783afd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7"
Apr 24 21:27:28.977311 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.976927 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/154e8a35-de7d-4d32-a077-f455b275faf2-hosts-file\") pod \"node-resolver-5fmr6\" (UID: \"154e8a35-de7d-4d32-a077-f455b275faf2\") " pod="openshift-dns/node-resolver-5fmr6"
Apr 24 21:27:28.977311 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.976978 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/54bfba7f-bc92-446e-9646-877d96783afd-sys-fs\") pod \"aws-ebs-csi-driver-node-t8mq7\" (UID: \"54bfba7f-bc92-446e-9646-877d96783afd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7"
Apr 24 21:27:28.977311 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.976990 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54bfba7f-bc92-446e-9646-877d96783afd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-t8mq7\" (UID: \"54bfba7f-bc92-446e-9646-877d96783afd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7"
Apr 24 21:27:28.977311 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977032 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-host-cni-bin\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.977311 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977060 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/67dbaaba-431e-4e09-9019-650f32d8999d-agent-certs\") pod \"konnectivity-agent-g5vd4\" (UID: \"67dbaaba-431e-4e09-9019-650f32d8999d\") " pod="kube-system/konnectivity-agent-g5vd4"
Apr 24 21:27:28.977311 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977085 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-etc-sysconfig\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg"
Apr 24 21:27:28.977311 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977089 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-host-cni-bin\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.977311 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977142 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-etc-sysconfig\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg"
Apr 24 21:27:28.977311 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977129 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/82fe3a05-41ab-423c-aab1-343f07ea6c35-tmp\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg"
Apr 24 21:27:28.977311 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977192 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-multus-cni-dir\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.977311 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977226 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-host-var-lib-cni-bin\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.977311 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977269 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/02092214-a2c7-40c0-8e80-688f20002a35-iptables-alerter-script\") pod \"iptables-alerter-xtssb\" (UID: \"02092214-a2c7-40c0-8e80-688f20002a35\") " pod="openshift-network-operator/iptables-alerter-xtssb"
Apr 24 21:27:28.977311 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977289 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-multus-cni-dir\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.978064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977328 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/20dab956-3a56-4a27-bcee-9f75822a7970-original-pull-secret\") pod \"global-pull-secret-syncer-pvftk\" (UID: \"20dab956-3a56-4a27-bcee-9f75822a7970\") " pod="kube-system/global-pull-secret-syncer-pvftk"
Apr 24 21:27:28.978064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977362 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/54bfba7f-bc92-446e-9646-877d96783afd-registration-dir\") pod \"aws-ebs-csi-driver-node-t8mq7\" (UID: \"54bfba7f-bc92-446e-9646-877d96783afd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7"
Apr 24 21:27:28.978064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977388 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-host-run-netns\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.978064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977416 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-log-socket\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.978064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977432 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gcwlm\" (UID: \"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c\") " pod="openshift-multus/multus-additional-cni-plugins-gcwlm"
Apr 24 21:27:28.978064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977443 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-host-var-lib-cni-bin\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.978064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977450 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/28a670ce-fdb2-4872-af68-5a9ab19b64cc-cni-binary-copy\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.978064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977442 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/06abc9ab-6358-4dae-add4-0d288195411f-host\") pod \"node-ca-qx6tc\" (UID: \"06abc9ab-6358-4dae-add4-0d288195411f\") " pod="openshift-image-registry/node-ca-qx6tc"
Apr 24 21:27:28.978064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977447 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 24 21:27:28.978064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977493 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-log-socket\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.978064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977498 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/154e8a35-de7d-4d32-a077-f455b275faf2-tmp-dir\") pod \"node-resolver-5fmr6\" (UID: \"154e8a35-de7d-4d32-a077-f455b275faf2\") " pod="openshift-dns/node-resolver-5fmr6"
Apr 24 21:27:28.978064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977504 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/54bfba7f-bc92-446e-9646-877d96783afd-registration-dir\") pod \"aws-ebs-csi-driver-node-t8mq7\" (UID: \"54bfba7f-bc92-446e-9646-877d96783afd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7"
Apr 24 21:27:28.978064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977533 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-host-run-netns\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.978064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977553 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/06abc9ab-6358-4dae-add4-0d288195411f-host\") pod \"node-ca-qx6tc\" (UID: \"06abc9ab-6358-4dae-add4-0d288195411f\") " pod="openshift-image-registry/node-ca-qx6tc"
Apr 24 21:27:28.978064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977561 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwm45\" (UniqueName: \"kubernetes.io/projected/154e8a35-de7d-4d32-a077-f455b275faf2-kube-api-access-gwm45\") pod \"node-resolver-5fmr6\" (UID: \"154e8a35-de7d-4d32-a077-f455b275faf2\") " pod="openshift-dns/node-resolver-5fmr6"
Apr 24 21:27:28.978064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977592 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-host-run-k8s-cni-cncf-io\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.978064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977615 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/54bfba7f-bc92-446e-9646-877d96783afd-socket-dir\") pod \"aws-ebs-csi-driver-node-t8mq7\" (UID: \"54bfba7f-bc92-446e-9646-877d96783afd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7"
Apr 24 21:27:28.978064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977638 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/54bfba7f-bc92-446e-9646-877d96783afd-etc-selinux\") pod \"aws-ebs-csi-driver-node-t8mq7\" (UID: \"54bfba7f-bc92-446e-9646-877d96783afd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7"
Apr 24 21:27:28.978911 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977692 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c-os-release\") pod \"multus-additional-cni-plugins-gcwlm\" (UID: \"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c\") " pod="openshift-multus/multus-additional-cni-plugins-gcwlm"
Apr 24 21:27:28.978911 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977718 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-node-log\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.978911 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977742 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/67dbaaba-431e-4e09-9019-650f32d8999d-konnectivity-ca\") pod \"konnectivity-agent-g5vd4\" (UID: \"67dbaaba-431e-4e09-9019-650f32d8999d\") " pod="kube-system/konnectivity-agent-g5vd4"
Apr 24 21:27:28.978911 ip-10-0-129-36
kubenswrapper[2577]: I0424 21:27:28.977766 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-run\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg" Apr 24 21:27:28.978911 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977801 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/154e8a35-de7d-4d32-a077-f455b275faf2-tmp-dir\") pod \"node-resolver-5fmr6\" (UID: \"154e8a35-de7d-4d32-a077-f455b275faf2\") " pod="openshift-dns/node-resolver-5fmr6" Apr 24 21:27:28.978911 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977809 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/02092214-a2c7-40c0-8e80-688f20002a35-iptables-alerter-script\") pod \"iptables-alerter-xtssb\" (UID: \"02092214-a2c7-40c0-8e80-688f20002a35\") " pod="openshift-network-operator/iptables-alerter-xtssb" Apr 24 21:27:28.978911 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977817 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/20dab956-3a56-4a27-bcee-9f75822a7970-kubelet-config\") pod \"global-pull-secret-syncer-pvftk\" (UID: \"20dab956-3a56-4a27-bcee-9f75822a7970\") " pod="kube-system/global-pull-secret-syncer-pvftk" Apr 24 21:27:28.978911 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977835 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/54bfba7f-bc92-446e-9646-877d96783afd-socket-dir\") pod \"aws-ebs-csi-driver-node-t8mq7\" (UID: \"54bfba7f-bc92-446e-9646-877d96783afd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7" Apr 24 
21:27:28.978911 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977847 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjxt4\" (UniqueName: \"kubernetes.io/projected/25d36037-41e8-4ba4-9072-939c3c9e4e19-kube-api-access-cjxt4\") pod \"network-check-target-cnjcx\" (UID: \"25d36037-41e8-4ba4-9072-939c3c9e4e19\") " pod="openshift-network-diagnostics/network-check-target-cnjcx" Apr 24 21:27:28.978911 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977853 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/54bfba7f-bc92-446e-9646-877d96783afd-etc-selinux\") pod \"aws-ebs-csi-driver-node-t8mq7\" (UID: \"54bfba7f-bc92-446e-9646-877d96783afd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7" Apr 24 21:27:28.978911 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977873 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-host-slash\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" Apr 24 21:27:28.978911 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977889 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-run\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg" Apr 24 21:27:28.978911 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977897 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-host\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg" Apr 24 
21:27:28.978911 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977899 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-host-run-k8s-cni-cncf-io\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn" Apr 24 21:27:28.978911 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977899 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-node-log\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" Apr 24 21:27:28.978911 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.977932 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/20dab956-3a56-4a27-bcee-9f75822a7970-dbus\") pod \"global-pull-secret-syncer-pvftk\" (UID: \"20dab956-3a56-4a27-bcee-9f75822a7970\") " pod="kube-system/global-pull-secret-syncer-pvftk" Apr 24 21:27:28.978911 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978047 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-host-slash\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" Apr 24 21:27:28.979685 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978090 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-etc-openvswitch\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" Apr 24 21:27:28.979685 
ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978125 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-host\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg" Apr 24 21:27:28.979685 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978153 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/82fe3a05-41ab-423c-aab1-343f07ea6c35-etc-tuned\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg" Apr 24 21:27:28.979685 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978193 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b45dj\" (UniqueName: \"kubernetes.io/projected/82fe3a05-41ab-423c-aab1-343f07ea6c35-kube-api-access-b45dj\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg" Apr 24 21:27:28.979685 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978218 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gcwlm\" (UID: \"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c\") " pod="openshift-multus/multus-additional-cni-plugins-gcwlm" Apr 24 21:27:28.979685 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978238 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-etc-openvswitch\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" Apr 24 
21:27:28.979685 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978242 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-cnibin\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn" Apr 24 21:27:28.979685 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978301 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/67dbaaba-431e-4e09-9019-650f32d8999d-konnectivity-ca\") pod \"konnectivity-agent-g5vd4\" (UID: \"67dbaaba-431e-4e09-9019-650f32d8999d\") " pod="kube-system/konnectivity-agent-g5vd4" Apr 24 21:27:28.979685 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978330 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c-os-release\") pod \"multus-additional-cni-plugins-gcwlm\" (UID: \"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c\") " pod="openshift-multus/multus-additional-cni-plugins-gcwlm" Apr 24 21:27:28.979685 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978380 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-host-var-lib-kubelet\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn" Apr 24 21:27:28.979685 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978416 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/28a670ce-fdb2-4872-af68-5a9ab19b64cc-multus-daemon-config\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn" Apr 24 21:27:28.979685 ip-10-0-129-36 
kubenswrapper[2577]: I0424 21:27:28.978442 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c-cnibin\") pod \"multus-additional-cni-plugins-gcwlm\" (UID: \"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c\") " pod="openshift-multus/multus-additional-cni-plugins-gcwlm" Apr 24 21:27:28.979685 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978416 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gcwlm\" (UID: \"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c\") " pod="openshift-multus/multus-additional-cni-plugins-gcwlm" Apr 24 21:27:28.979685 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978443 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-cnibin\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn" Apr 24 21:27:28.979685 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978469 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-env-overrides\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" Apr 24 21:27:28.979685 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978484 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-host-var-lib-kubelet\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn" Apr 24 21:27:28.979685 ip-10-0-129-36 kubenswrapper[2577]: 
I0424 21:27:28.978493 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c-cnibin\") pod \"multus-additional-cni-plugins-gcwlm\" (UID: \"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c\") " pod="openshift-multus/multus-additional-cni-plugins-gcwlm" Apr 24 21:27:28.979685 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978506 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-etc-sysctl-d\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg" Apr 24 21:27:28.980554 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978537 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-etc-sysctl-conf\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg" Apr 24 21:27:28.980554 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978560 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02092214-a2c7-40c0-8e80-688f20002a35-host-slash\") pod \"iptables-alerter-xtssb\" (UID: \"02092214-a2c7-40c0-8e80-688f20002a35\") " pod="openshift-network-operator/iptables-alerter-xtssb" Apr 24 21:27:28.980554 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978579 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-lib-modules\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg" Apr 24 21:27:28.980554 ip-10-0-129-36 
kubenswrapper[2577]: I0424 21:27:28.978600 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-host-run-multus-certs\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn" Apr 24 21:27:28.980554 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978623 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-var-lib-openvswitch\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" Apr 24 21:27:28.980554 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978647 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/54bfba7f-bc92-446e-9646-877d96783afd-device-dir\") pod \"aws-ebs-csi-driver-node-t8mq7\" (UID: \"54bfba7f-bc92-446e-9646-877d96783afd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7" Apr 24 21:27:28.980554 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978670 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-systemd-units\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" Apr 24 21:27:28.980554 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978693 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-ovnkube-config\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" Apr 24 
21:27:28.980554 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978698 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-etc-sysctl-conf\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg" Apr 24 21:27:28.980554 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978707 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-etc-sysctl-d\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg" Apr 24 21:27:28.980554 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978716 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9s965\" (UniqueName: \"kubernetes.io/projected/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-kube-api-access-9s965\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" Apr 24 21:27:28.980554 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978751 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02092214-a2c7-40c0-8e80-688f20002a35-host-slash\") pod \"iptables-alerter-xtssb\" (UID: \"02092214-a2c7-40c0-8e80-688f20002a35\") " pod="openshift-network-operator/iptables-alerter-xtssb" Apr 24 21:27:28.980554 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978758 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-var-lib-openvswitch\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" Apr 
24 21:27:28.980554 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978762 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-lib-modules\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg" Apr 24 21:27:28.980554 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978795 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-systemd-units\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" Apr 24 21:27:28.980554 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978824 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-os-release\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn" Apr 24 21:27:28.980554 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978867 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgbdz\" (UniqueName: \"kubernetes.io/projected/28a670ce-fdb2-4872-af68-5a9ab19b64cc-kube-api-access-xgbdz\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn" Apr 24 21:27:28.980554 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978880 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-host-run-multus-certs\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn" Apr 24 21:27:28.981368 ip-10-0-129-36 kubenswrapper[2577]: I0424 
21:27:28.978901 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/28a670ce-fdb2-4872-af68-5a9ab19b64cc-multus-daemon-config\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn" Apr 24 21:27:28.981368 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978896 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs\") pod \"network-metrics-daemon-jcztz\" (UID: \"b0b45556-212a-460b-a5ae-108beeb6197d\") " pod="openshift-multus/network-metrics-daemon-jcztz" Apr 24 21:27:28.981368 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978948 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/54bfba7f-bc92-446e-9646-877d96783afd-device-dir\") pod \"aws-ebs-csi-driver-node-t8mq7\" (UID: \"54bfba7f-bc92-446e-9646-877d96783afd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7" Apr 24 21:27:28.981368 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978948 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-env-overrides\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" Apr 24 21:27:28.981368 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978951 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c-system-cni-dir\") pod \"multus-additional-cni-plugins-gcwlm\" (UID: \"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c\") " pod="openshift-multus/multus-additional-cni-plugins-gcwlm" Apr 24 21:27:28.981368 
ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.978982 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c-system-cni-dir\") pod \"multus-additional-cni-plugins-gcwlm\" (UID: \"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c\") " pod="openshift-multus/multus-additional-cni-plugins-gcwlm" Apr 24 21:27:28.981368 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979001 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-host-kubelet\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" Apr 24 21:27:28.981368 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979028 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-run-openvswitch\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" Apr 24 21:27:28.981368 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979035 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-os-release\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn" Apr 24 21:27:28.981368 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979061 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-run-ovn\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" Apr 24 21:27:28.981368 ip-10-0-129-36 
kubenswrapper[2577]: I0424 21:27:28.979102 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-host-kubelet\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" Apr 24 21:27:28.981368 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979106 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lkvrp\" (UniqueName: \"kubernetes.io/projected/06abc9ab-6358-4dae-add4-0d288195411f-kube-api-access-lkvrp\") pod \"node-ca-qx6tc\" (UID: \"06abc9ab-6358-4dae-add4-0d288195411f\") " pod="openshift-image-registry/node-ca-qx6tc" Apr 24 21:27:28.981368 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979078 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-run-ovn\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" Apr 24 21:27:28.981368 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979139 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-host-var-lib-cni-multus\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn" Apr 24 21:27:28.981368 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979152 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-run-openvswitch\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" Apr 24 21:27:28.981368 ip-10-0-129-36 kubenswrapper[2577]: I0424 
21:27:28.979173 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-host-var-lib-cni-multus\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.981368 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979182 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6n6cm\" (UniqueName: \"kubernetes.io/projected/8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c-kube-api-access-6n6cm\") pod \"multus-additional-cni-plugins-gcwlm\" (UID: \"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c\") " pod="openshift-multus/multus-additional-cni-plugins-gcwlm"
Apr 24 21:27:28.982124 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979211 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-host-run-netns\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.982124 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979226 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-ovnkube-config\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.982124 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:28.979234 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:28.982124 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979243 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-etc-modprobe-d\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg"
Apr 24 21:27:28.982124 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979299 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-host-run-netns\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.982124 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:28.979327 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs podName:b0b45556-212a-460b-a5ae-108beeb6197d nodeName:}" failed. No retries permitted until 2026-04-24 21:27:29.479295373 +0000 UTC m=+3.093565664 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs") pod "network-metrics-daemon-jcztz" (UID: "b0b45556-212a-460b-a5ae-108beeb6197d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:28.982124 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979352 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-etc-kubernetes\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg"
Apr 24 21:27:28.982124 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979380 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-etc-systemd\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg"
Apr 24 21:27:28.982124 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979387 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-etc-modprobe-d\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg"
Apr 24 21:27:28.982124 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979406 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-var-lib-kubelet\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg"
Apr 24 21:27:28.982124 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979425 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-etc-kubernetes\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg"
Apr 24 21:27:28.982124 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979436 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-hostroot\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.982124 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979448 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-etc-systemd\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg"
Apr 24 21:27:28.982124 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979470 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-host-run-ovn-kubernetes\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.982124 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979482 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-var-lib-kubelet\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg"
Apr 24 21:27:28.982124 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979507 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-hostroot\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.982124 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979531 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-host-run-ovn-kubernetes\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.982962 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979560 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-host-cni-netd\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.982962 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979584 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/06abc9ab-6358-4dae-add4-0d288195411f-serviceca\") pod \"node-ca-qx6tc\" (UID: \"06abc9ab-6358-4dae-add4-0d288195411f\") " pod="openshift-image-registry/node-ca-qx6tc"
Apr 24 21:27:28.982962 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979589 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-host-cni-netd\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.982962 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979608 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zbx86\" (UniqueName: \"kubernetes.io/projected/54bfba7f-bc92-446e-9646-877d96783afd-kube-api-access-zbx86\") pod \"aws-ebs-csi-driver-node-t8mq7\" (UID: \"54bfba7f-bc92-446e-9646-877d96783afd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7"
Apr 24 21:27:28.982962 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979634 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.982962 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979662 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-ovnkube-script-lib\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.982962 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979686 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gcwlm\" (UID: \"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c\") " pod="openshift-multus/multus-additional-cni-plugins-gcwlm"
Apr 24 21:27:28.982962 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979728 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-multus-socket-dir-parent\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.982962 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979750 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-etc-kubernetes\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.982962 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979793 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-etc-kubernetes\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.982962 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979909 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.982962 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.979971 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/28a670ce-fdb2-4872-af68-5a9ab19b64cc-multus-socket-dir-parent\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:28.982962 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.980003 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qsknp\" (UniqueName: \"kubernetes.io/projected/b0b45556-212a-460b-a5ae-108beeb6197d-kube-api-access-qsknp\") pod \"network-metrics-daemon-jcztz\" (UID: \"b0b45556-212a-460b-a5ae-108beeb6197d\") " pod="openshift-multus/network-metrics-daemon-jcztz"
Apr 24 21:27:28.982962 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.980031 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c-cni-binary-copy\") pod \"multus-additional-cni-plugins-gcwlm\" (UID: \"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c\") " pod="openshift-multus/multus-additional-cni-plugins-gcwlm"
Apr 24 21:27:28.982962 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.980035 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/06abc9ab-6358-4dae-add4-0d288195411f-serviceca\") pod \"node-ca-qx6tc\" (UID: \"06abc9ab-6358-4dae-add4-0d288195411f\") " pod="openshift-image-registry/node-ca-qx6tc"
Apr 24 21:27:28.982962 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.980057 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-ovn-node-metrics-cert\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.982962 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.980081 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-sys\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg"
Apr 24 21:27:28.983770 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.980156 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/82fe3a05-41ab-423c-aab1-343f07ea6c35-sys\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg"
Apr 24 21:27:28.983770 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.981475 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gcwlm\" (UID: \"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c\") " pod="openshift-multus/multus-additional-cni-plugins-gcwlm"
Apr 24 21:27:28.983770 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.981608 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-ovnkube-script-lib\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.983770 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.981809 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/82fe3a05-41ab-423c-aab1-343f07ea6c35-etc-tuned\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg"
Apr 24 21:27:28.983770 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.981971 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/82fe3a05-41ab-423c-aab1-343f07ea6c35-tmp\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg"
Apr 24 21:27:28.983770 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.982285 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c-cni-binary-copy\") pod \"multus-additional-cni-plugins-gcwlm\" (UID: \"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c\") " pod="openshift-multus/multus-additional-cni-plugins-gcwlm"
Apr 24 21:27:28.983770 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.982938 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/67dbaaba-431e-4e09-9019-650f32d8999d-agent-certs\") pod \"konnectivity-agent-g5vd4\" (UID: \"67dbaaba-431e-4e09-9019-650f32d8999d\") " pod="kube-system/konnectivity-agent-g5vd4"
Apr 24 21:27:28.986977 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.986945 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-ovn-node-metrics-cert\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.994788 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:28.994376 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:28.994788 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:28.994399 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:28.994788 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:28.994412 2577 projected.go:194] Error preparing data for projected volume kube-api-access-cjxt4 for pod openshift-network-diagnostics/network-check-target-cnjcx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:28.994788 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:28.994477 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/25d36037-41e8-4ba4-9072-939c3c9e4e19-kube-api-access-cjxt4 podName:25d36037-41e8-4ba4-9072-939c3c9e4e19 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:29.494460217 +0000 UTC m=+3.108730498 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cjxt4" (UniqueName: "kubernetes.io/projected/25d36037-41e8-4ba4-9072-939c3c9e4e19-kube-api-access-cjxt4") pod "network-check-target-cnjcx" (UID: "25d36037-41e8-4ba4-9072-939c3c9e4e19") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:28.994788 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.994734 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwm45\" (UniqueName: \"kubernetes.io/projected/154e8a35-de7d-4d32-a077-f455b275faf2-kube-api-access-gwm45\") pod \"node-resolver-5fmr6\" (UID: \"154e8a35-de7d-4d32-a077-f455b275faf2\") " pod="openshift-dns/node-resolver-5fmr6"
Apr 24 21:27:28.995502 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.995474 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s965\" (UniqueName: \"kubernetes.io/projected/2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0-kube-api-access-9s965\") pod \"ovnkube-node-csxqq\" (UID: \"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0\") " pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:28.995855 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.995821 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b45dj\" (UniqueName: \"kubernetes.io/projected/82fe3a05-41ab-423c-aab1-343f07ea6c35-kube-api-access-b45dj\") pod \"tuned-4bxgg\" (UID: \"82fe3a05-41ab-423c-aab1-343f07ea6c35\") " pod="openshift-cluster-node-tuning-operator/tuned-4bxgg"
Apr 24 21:27:28.996123 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.996102 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv578\" (UniqueName: \"kubernetes.io/projected/02092214-a2c7-40c0-8e80-688f20002a35-kube-api-access-pv578\") pod \"iptables-alerter-xtssb\" (UID: \"02092214-a2c7-40c0-8e80-688f20002a35\") " pod="openshift-network-operator/iptables-alerter-xtssb"
Apr 24 21:27:28.996298 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.996274 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkvrp\" (UniqueName: \"kubernetes.io/projected/06abc9ab-6358-4dae-add4-0d288195411f-kube-api-access-lkvrp\") pod \"node-ca-qx6tc\" (UID: \"06abc9ab-6358-4dae-add4-0d288195411f\") " pod="openshift-image-registry/node-ca-qx6tc"
Apr 24 21:27:28.998116 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.998086 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsknp\" (UniqueName: \"kubernetes.io/projected/b0b45556-212a-460b-a5ae-108beeb6197d-kube-api-access-qsknp\") pod \"network-metrics-daemon-jcztz\" (UID: \"b0b45556-212a-460b-a5ae-108beeb6197d\") " pod="openshift-multus/network-metrics-daemon-jcztz"
Apr 24 21:27:28.998513 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.998495 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbx86\" (UniqueName: \"kubernetes.io/projected/54bfba7f-bc92-446e-9646-877d96783afd-kube-api-access-zbx86\") pod \"aws-ebs-csi-driver-node-t8mq7\" (UID: \"54bfba7f-bc92-446e-9646-877d96783afd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7"
Apr 24 21:27:28.998637 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.998614 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n6cm\" (UniqueName: \"kubernetes.io/projected/8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c-kube-api-access-6n6cm\") pod \"multus-additional-cni-plugins-gcwlm\" (UID: \"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c\") " pod="openshift-multus/multus-additional-cni-plugins-gcwlm"
Apr 24 21:27:28.999288 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:28.999269 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgbdz\" (UniqueName: \"kubernetes.io/projected/28a670ce-fdb2-4872-af68-5a9ab19b64cc-kube-api-access-xgbdz\") pod \"multus-9vwgn\" (UID: \"28a670ce-fdb2-4872-af68-5a9ab19b64cc\") " pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:29.080946 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:29.080915 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/20dab956-3a56-4a27-bcee-9f75822a7970-original-pull-secret\") pod \"global-pull-secret-syncer-pvftk\" (UID: \"20dab956-3a56-4a27-bcee-9f75822a7970\") " pod="kube-system/global-pull-secret-syncer-pvftk"
Apr 24 21:27:29.081115 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:29.080955 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/20dab956-3a56-4a27-bcee-9f75822a7970-kubelet-config\") pod \"global-pull-secret-syncer-pvftk\" (UID: \"20dab956-3a56-4a27-bcee-9f75822a7970\") " pod="kube-system/global-pull-secret-syncer-pvftk"
Apr 24 21:27:29.081115 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:29.080977 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/20dab956-3a56-4a27-bcee-9f75822a7970-dbus\") pod \"global-pull-secret-syncer-pvftk\" (UID: \"20dab956-3a56-4a27-bcee-9f75822a7970\") " pod="kube-system/global-pull-secret-syncer-pvftk"
Apr 24 21:27:29.081115 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:29.081020 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:29.081115 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:29.081094 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20dab956-3a56-4a27-bcee-9f75822a7970-original-pull-secret podName:20dab956-3a56-4a27-bcee-9f75822a7970 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:29.581073179 +0000 UTC m=+3.195343447 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/20dab956-3a56-4a27-bcee-9f75822a7970-original-pull-secret") pod "global-pull-secret-syncer-pvftk" (UID: "20dab956-3a56-4a27-bcee-9f75822a7970") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:29.081115 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:29.081097 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/20dab956-3a56-4a27-bcee-9f75822a7970-kubelet-config\") pod \"global-pull-secret-syncer-pvftk\" (UID: \"20dab956-3a56-4a27-bcee-9f75822a7970\") " pod="kube-system/global-pull-secret-syncer-pvftk"
Apr 24 21:27:29.081434 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:29.081119 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/20dab956-3a56-4a27-bcee-9f75822a7970-dbus\") pod \"global-pull-secret-syncer-pvftk\" (UID: \"20dab956-3a56-4a27-bcee-9f75822a7970\") " pod="kube-system/global-pull-secret-syncer-pvftk"
Apr 24 21:27:29.165575 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:29.165493 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4bxgg"
Apr 24 21:27:29.167907 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:29.167888 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:27:29.173058 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:29.173032 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:29.181748 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:29.181725 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9vwgn"
Apr 24 21:27:29.187338 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:29.187319 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xtssb"
Apr 24 21:27:29.193961 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:29.193944 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-g5vd4"
Apr 24 21:27:29.200530 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:29.200510 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qx6tc"
Apr 24 21:27:29.207077 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:29.207055 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5fmr6"
Apr 24 21:27:29.213324 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:29.213297 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7"
Apr 24 21:27:29.218882 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:29.218859 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gcwlm"
Apr 24 21:27:29.485067 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:29.484998 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs\") pod \"network-metrics-daemon-jcztz\" (UID: \"b0b45556-212a-460b-a5ae-108beeb6197d\") " pod="openshift-multus/network-metrics-daemon-jcztz"
Apr 24 21:27:29.485209 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:29.485128 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:29.485209 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:29.485203 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs podName:b0b45556-212a-460b-a5ae-108beeb6197d nodeName:}" failed. No retries permitted until 2026-04-24 21:27:30.485184392 +0000 UTC m=+4.099454662 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs") pod "network-metrics-daemon-jcztz" (UID: "b0b45556-212a-460b-a5ae-108beeb6197d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:29.508897 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:29.508851 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54bfba7f_bc92_446e_9646_877d96783afd.slice/crio-5f88d384ab2fda7fde02add1874a3cb54b1aedd94eb4c716fc72116ad9758dae WatchSource:0}: Error finding container 5f88d384ab2fda7fde02add1874a3cb54b1aedd94eb4c716fc72116ad9758dae: Status 404 returned error can't find the container with id 5f88d384ab2fda7fde02add1874a3cb54b1aedd94eb4c716fc72116ad9758dae
Apr 24 21:27:29.510286 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:29.510210 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ba8ae41_c3c4_46dd_aecb_a8d704c38c1c.slice/crio-a5cadbac516e5f4669e7380ec62dafe0b5193853b4f67648ecdcf91ef2035c58 WatchSource:0}: Error finding container a5cadbac516e5f4669e7380ec62dafe0b5193853b4f67648ecdcf91ef2035c58: Status 404 returned error can't find the container with id a5cadbac516e5f4669e7380ec62dafe0b5193853b4f67648ecdcf91ef2035c58
Apr 24 21:27:29.513730 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:29.513707 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod154e8a35_de7d_4d32_a077_f455b275faf2.slice/crio-7903a9f59d5a82a8dd2e2ad2abe347d1cce3dc3ccd369d2376390be1834f4a48 WatchSource:0}: Error finding container 7903a9f59d5a82a8dd2e2ad2abe347d1cce3dc3ccd369d2376390be1834f4a48: Status 404 returned error can't find the container with id 7903a9f59d5a82a8dd2e2ad2abe347d1cce3dc3ccd369d2376390be1834f4a48
Apr 24 21:27:29.515368 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:29.515338 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82fe3a05_41ab_423c_aab1_343f07ea6c35.slice/crio-4a0da7619e9ffbcaeefb42d3b738845e6f04507c78b54257350b6957ec53655f WatchSource:0}: Error finding container 4a0da7619e9ffbcaeefb42d3b738845e6f04507c78b54257350b6957ec53655f: Status 404 returned error can't find the container with id 4a0da7619e9ffbcaeefb42d3b738845e6f04507c78b54257350b6957ec53655f
Apr 24 21:27:29.516172 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:29.516033 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28a670ce_fdb2_4872_af68_5a9ab19b64cc.slice/crio-9b6c5814cd7cd0117598c0d3172f93fca2c39f41cadf66e754a1528eb56a34ed WatchSource:0}: Error finding container 9b6c5814cd7cd0117598c0d3172f93fca2c39f41cadf66e754a1528eb56a34ed: Status 404 returned error can't find the container with id 9b6c5814cd7cd0117598c0d3172f93fca2c39f41cadf66e754a1528eb56a34ed
Apr 24 21:27:29.518508 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:29.518193 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2054eaf6_5d96_49ee_86ed_e32bdb5b9ea0.slice/crio-bfda35b646a18b6a8ab9e59be3bde3bc76cf19a18ff4df005f632658a8c3d7ce WatchSource:0}: Error finding container bfda35b646a18b6a8ab9e59be3bde3bc76cf19a18ff4df005f632658a8c3d7ce: Status 404 returned error can't find the container with id bfda35b646a18b6a8ab9e59be3bde3bc76cf19a18ff4df005f632658a8c3d7ce
Apr 24 21:27:29.518755 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:29.518715 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06abc9ab_6358_4dae_add4_0d288195411f.slice/crio-cfe028125d3375a6eb3f01ab59cf3f23c1ad34e2bba3ad60754fbb615c048785 WatchSource:0}: Error finding container cfe028125d3375a6eb3f01ab59cf3f23c1ad34e2bba3ad60754fbb615c048785: Status 404 returned error can't find the container with id cfe028125d3375a6eb3f01ab59cf3f23c1ad34e2bba3ad60754fbb615c048785
Apr 24 21:27:29.519849 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:29.519393 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67dbaaba_431e_4e09_9019_650f32d8999d.slice/crio-0acb663f705b4e8449a572c6c14297c4bfe6f5c407aeb4212b277a088f09e2f8 WatchSource:0}: Error finding container 0acb663f705b4e8449a572c6c14297c4bfe6f5c407aeb4212b277a088f09e2f8: Status 404 returned error can't find the container with id 0acb663f705b4e8449a572c6c14297c4bfe6f5c407aeb4212b277a088f09e2f8
Apr 24 21:27:29.521221 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:27:29.520935 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02092214_a2c7_40c0_8e80_688f20002a35.slice/crio-3fee8eb0ee8797977470e9fd498d64c3276026ca097db0b44672c478181fa01e WatchSource:0}: Error finding container 3fee8eb0ee8797977470e9fd498d64c3276026ca097db0b44672c478181fa01e: Status 404 returned error can't find the container with id 3fee8eb0ee8797977470e9fd498d64c3276026ca097db0b44672c478181fa01e
Apr 24 21:27:29.586187 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:29.586011 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/20dab956-3a56-4a27-bcee-9f75822a7970-original-pull-secret\") pod \"global-pull-secret-syncer-pvftk\" (UID: \"20dab956-3a56-4a27-bcee-9f75822a7970\") " pod="kube-system/global-pull-secret-syncer-pvftk"
Apr 24 21:27:29.586340 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:29.586220 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjxt4\" (UniqueName: \"kubernetes.io/projected/25d36037-41e8-4ba4-9072-939c3c9e4e19-kube-api-access-cjxt4\") pod \"network-check-target-cnjcx\" (UID: \"25d36037-41e8-4ba4-9072-939c3c9e4e19\") " pod="openshift-network-diagnostics/network-check-target-cnjcx"
Apr 24 21:27:29.586340 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:29.586163 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:29.586340 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:29.586323 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20dab956-3a56-4a27-bcee-9f75822a7970-original-pull-secret podName:20dab956-3a56-4a27-bcee-9f75822a7970 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:30.586306199 +0000 UTC m=+4.200576469 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/20dab956-3a56-4a27-bcee-9f75822a7970-original-pull-secret") pod "global-pull-secret-syncer-pvftk" (UID: "20dab956-3a56-4a27-bcee-9f75822a7970") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:29.586510 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:29.586392 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:29.586510 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:29.586409 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:29.586510 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:29.586420 2577 projected.go:194] Error preparing data for projected volume kube-api-access-cjxt4 for pod openshift-network-diagnostics/network-check-target-cnjcx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:29.586510 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:29.586472 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/25d36037-41e8-4ba4-9072-939c3c9e4e19-kube-api-access-cjxt4 podName:25d36037-41e8-4ba4-9072-939c3c9e4e19 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:30.58645621 +0000 UTC m=+4.200726481 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cjxt4" (UniqueName: "kubernetes.io/projected/25d36037-41e8-4ba4-9072-939c3c9e4e19-kube-api-access-cjxt4") pod "network-check-target-cnjcx" (UID: "25d36037-41e8-4ba4-9072-939c3c9e4e19") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:29.907285 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:29.907195 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:22:27 +0000 UTC" deadline="2027-12-03 12:43:00.574875982 +0000 UTC"
Apr 24 21:27:29.907285 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:29.907236 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14103h15m30.66764387s"
Apr 24 21:27:29.970132 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:29.969476 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cnjcx"
Apr 24 21:27:29.970132 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:29.969650 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cnjcx" podUID="25d36037-41e8-4ba4-9072-939c3c9e4e19"
Apr 24 21:27:29.981930 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:29.981198 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-36.ec2.internal" event={"ID":"46c159a6e67f71d6ddbcaa845877ef38","Type":"ContainerStarted","Data":"9f00058975b0188b92e3da98fecd984e5c4f4efb861da742a7bba89ad8ab2209"}
Apr 24 21:27:29.983898 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:29.983823 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9vwgn" event={"ID":"28a670ce-fdb2-4872-af68-5a9ab19b64cc","Type":"ContainerStarted","Data":"9b6c5814cd7cd0117598c0d3172f93fca2c39f41cadf66e754a1528eb56a34ed"}
Apr 24 21:27:29.988090 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:29.988008 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" event={"ID":"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0","Type":"ContainerStarted","Data":"bfda35b646a18b6a8ab9e59be3bde3bc76cf19a18ff4df005f632658a8c3d7ce"}
Apr 24 21:27:29.994954 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:29.994670 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4bxgg" event={"ID":"82fe3a05-41ab-423c-aab1-343f07ea6c35","Type":"ContainerStarted","Data":"4a0da7619e9ffbcaeefb42d3b738845e6f04507c78b54257350b6957ec53655f"}
Apr 24 21:27:29.999625 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:29.999592 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gcwlm" event={"ID":"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c","Type":"ContainerStarted","Data":"a5cadbac516e5f4669e7380ec62dafe0b5193853b4f67648ecdcf91ef2035c58"}
Apr 24 21:27:30.004410 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:30.003894 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="openshift-network-operator/iptables-alerter-xtssb" event={"ID":"02092214-a2c7-40c0-8e80-688f20002a35","Type":"ContainerStarted","Data":"3fee8eb0ee8797977470e9fd498d64c3276026ca097db0b44672c478181fa01e"} Apr 24 21:27:30.005684 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:30.005657 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:30.015628 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:30.010501 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-g5vd4" event={"ID":"67dbaaba-431e-4e09-9019-650f32d8999d","Type":"ContainerStarted","Data":"0acb663f705b4e8449a572c6c14297c4bfe6f5c407aeb4212b277a088f09e2f8"} Apr 24 21:27:30.015628 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:30.012959 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qx6tc" event={"ID":"06abc9ab-6358-4dae-add4-0d288195411f","Type":"ContainerStarted","Data":"cfe028125d3375a6eb3f01ab59cf3f23c1ad34e2bba3ad60754fbb615c048785"} Apr 24 21:27:30.016220 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:30.016194 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5fmr6" event={"ID":"154e8a35-de7d-4d32-a077-f455b275faf2","Type":"ContainerStarted","Data":"7903a9f59d5a82a8dd2e2ad2abe347d1cce3dc3ccd369d2376390be1834f4a48"} Apr 24 21:27:30.021433 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:30.021404 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7" event={"ID":"54bfba7f-bc92-446e-9646-877d96783afd","Type":"ContainerStarted","Data":"5f88d384ab2fda7fde02add1874a3cb54b1aedd94eb4c716fc72116ad9758dae"} Apr 24 21:27:30.495065 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:30.494896 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs\") pod \"network-metrics-daemon-jcztz\" (UID: \"b0b45556-212a-460b-a5ae-108beeb6197d\") " pod="openshift-multus/network-metrics-daemon-jcztz" Apr 24 21:27:30.495362 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:30.495225 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:30.495362 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:30.495318 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs podName:b0b45556-212a-460b-a5ae-108beeb6197d nodeName:}" failed. No retries permitted until 2026-04-24 21:27:32.495298838 +0000 UTC m=+6.109569101 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs") pod "network-metrics-daemon-jcztz" (UID: "b0b45556-212a-460b-a5ae-108beeb6197d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:30.596272 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:30.596108 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/20dab956-3a56-4a27-bcee-9f75822a7970-original-pull-secret\") pod \"global-pull-secret-syncer-pvftk\" (UID: \"20dab956-3a56-4a27-bcee-9f75822a7970\") " pod="kube-system/global-pull-secret-syncer-pvftk" Apr 24 21:27:30.596272 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:30.596167 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjxt4\" (UniqueName: \"kubernetes.io/projected/25d36037-41e8-4ba4-9072-939c3c9e4e19-kube-api-access-cjxt4\") pod \"network-check-target-cnjcx\" (UID: \"25d36037-41e8-4ba4-9072-939c3c9e4e19\") " pod="openshift-network-diagnostics/network-check-target-cnjcx" Apr 24 
21:27:30.596481 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:30.596353 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:30.596481 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:30.596371 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:30.596481 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:30.596384 2577 projected.go:194] Error preparing data for projected volume kube-api-access-cjxt4 for pod openshift-network-diagnostics/network-check-target-cnjcx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:30.596481 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:30.596444 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/25d36037-41e8-4ba4-9072-939c3c9e4e19-kube-api-access-cjxt4 podName:25d36037-41e8-4ba4-9072-939c3c9e4e19 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:32.596424361 +0000 UTC m=+6.210694622 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cjxt4" (UniqueName: "kubernetes.io/projected/25d36037-41e8-4ba4-9072-939c3c9e4e19-kube-api-access-cjxt4") pod "network-check-target-cnjcx" (UID: "25d36037-41e8-4ba4-9072-939c3c9e4e19") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:30.596865 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:30.596848 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:30.596924 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:30.596902 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20dab956-3a56-4a27-bcee-9f75822a7970-original-pull-secret podName:20dab956-3a56-4a27-bcee-9f75822a7970 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:32.59688707 +0000 UTC m=+6.211157328 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/20dab956-3a56-4a27-bcee-9f75822a7970-original-pull-secret") pod "global-pull-secret-syncer-pvftk" (UID: "20dab956-3a56-4a27-bcee-9f75822a7970") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:30.967716 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:30.966844 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcztz" Apr 24 21:27:30.967716 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:30.966994 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jcztz" podUID="b0b45556-212a-460b-a5ae-108beeb6197d" Apr 24 21:27:30.967716 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:30.967440 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pvftk" Apr 24 21:27:30.967716 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:30.967665 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pvftk" podUID="20dab956-3a56-4a27-bcee-9f75822a7970" Apr 24 21:27:31.034681 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:31.034495 2577 generic.go:358] "Generic (PLEG): container finished" podID="5803b248a78a545dbea4248274c87b99" containerID="6e60c504138e2c5001b2f295cdde7a152991f66f9e0281ea71485e01fdaac627" exitCode=0 Apr 24 21:27:31.035349 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:31.035312 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-36.ec2.internal" event={"ID":"5803b248a78a545dbea4248274c87b99","Type":"ContainerDied","Data":"6e60c504138e2c5001b2f295cdde7a152991f66f9e0281ea71485e01fdaac627"} Apr 24 21:27:31.054278 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:31.053401 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-36.ec2.internal" podStartSLOduration=3.053381269 podStartE2EDuration="3.053381269s" podCreationTimestamp="2026-04-24 21:27:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:29.998037995 +0000 UTC m=+3.612308276" watchObservedRunningTime="2026-04-24 21:27:31.053381269 
+0000 UTC m=+4.667651549" Apr 24 21:27:31.966772 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:31.966613 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cnjcx" Apr 24 21:27:31.966772 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:31.966745 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cnjcx" podUID="25d36037-41e8-4ba4-9072-939c3c9e4e19" Apr 24 21:27:32.055952 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:32.055913 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-36.ec2.internal" event={"ID":"5803b248a78a545dbea4248274c87b99","Type":"ContainerStarted","Data":"cd2938b75d1b2d1bc64ab68d9e7537adc81cfd62b1567ada34665d65a6348bc2"} Apr 24 21:27:32.072197 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:32.072144 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-36.ec2.internal" podStartSLOduration=4.072123997 podStartE2EDuration="4.072123997s" podCreationTimestamp="2026-04-24 21:27:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:32.071668059 +0000 UTC m=+5.685938340" watchObservedRunningTime="2026-04-24 21:27:32.072123997 +0000 UTC m=+5.686394278" Apr 24 21:27:32.513477 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:32.512836 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs\") pod \"network-metrics-daemon-jcztz\" (UID: \"b0b45556-212a-460b-a5ae-108beeb6197d\") " pod="openshift-multus/network-metrics-daemon-jcztz" Apr 24 21:27:32.513477 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:32.513056 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:32.513477 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:32.513114 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs podName:b0b45556-212a-460b-a5ae-108beeb6197d nodeName:}" failed. No retries permitted until 2026-04-24 21:27:36.513096152 +0000 UTC m=+10.127366424 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs") pod "network-metrics-daemon-jcztz" (UID: "b0b45556-212a-460b-a5ae-108beeb6197d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:32.613307 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:32.613229 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/20dab956-3a56-4a27-bcee-9f75822a7970-original-pull-secret\") pod \"global-pull-secret-syncer-pvftk\" (UID: \"20dab956-3a56-4a27-bcee-9f75822a7970\") " pod="kube-system/global-pull-secret-syncer-pvftk" Apr 24 21:27:32.613307 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:32.613312 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjxt4\" (UniqueName: \"kubernetes.io/projected/25d36037-41e8-4ba4-9072-939c3c9e4e19-kube-api-access-cjxt4\") pod \"network-check-target-cnjcx\" (UID: \"25d36037-41e8-4ba4-9072-939c3c9e4e19\") " pod="openshift-network-diagnostics/network-check-target-cnjcx" Apr 24 
21:27:32.613592 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:32.613384 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:32.613592 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:32.613464 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:32.613592 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:32.613474 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20dab956-3a56-4a27-bcee-9f75822a7970-original-pull-secret podName:20dab956-3a56-4a27-bcee-9f75822a7970 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:36.613438284 +0000 UTC m=+10.227708555 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/20dab956-3a56-4a27-bcee-9f75822a7970-original-pull-secret") pod "global-pull-secret-syncer-pvftk" (UID: "20dab956-3a56-4a27-bcee-9f75822a7970") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:32.613592 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:32.613482 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:32.613592 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:32.613494 2577 projected.go:194] Error preparing data for projected volume kube-api-access-cjxt4 for pod openshift-network-diagnostics/network-check-target-cnjcx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:32.613592 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:32.613538 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/25d36037-41e8-4ba4-9072-939c3c9e4e19-kube-api-access-cjxt4 podName:25d36037-41e8-4ba4-9072-939c3c9e4e19 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:36.613523319 +0000 UTC m=+10.227793582 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cjxt4" (UniqueName: "kubernetes.io/projected/25d36037-41e8-4ba4-9072-939c3c9e4e19-kube-api-access-cjxt4") pod "network-check-target-cnjcx" (UID: "25d36037-41e8-4ba4-9072-939c3c9e4e19") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:32.966922 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:32.966886 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pvftk" Apr 24 21:27:32.967096 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:32.967025 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pvftk" podUID="20dab956-3a56-4a27-bcee-9f75822a7970" Apr 24 21:27:32.967096 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:32.967076 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcztz" Apr 24 21:27:32.967225 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:32.967188 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jcztz" podUID="b0b45556-212a-460b-a5ae-108beeb6197d" Apr 24 21:27:33.966661 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:33.966618 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cnjcx" Apr 24 21:27:33.967138 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:33.966758 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cnjcx" podUID="25d36037-41e8-4ba4-9072-939c3c9e4e19" Apr 24 21:27:34.966951 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:34.966922 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pvftk" Apr 24 21:27:34.967426 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:34.966929 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcztz" Apr 24 21:27:34.967426 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:34.967047 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-pvftk" podUID="20dab956-3a56-4a27-bcee-9f75822a7970" Apr 24 21:27:34.967426 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:34.967141 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jcztz" podUID="b0b45556-212a-460b-a5ae-108beeb6197d" Apr 24 21:27:35.967098 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:35.967064 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cnjcx" Apr 24 21:27:35.967554 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:35.967199 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cnjcx" podUID="25d36037-41e8-4ba4-9072-939c3c9e4e19" Apr 24 21:27:36.549070 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:36.549027 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs\") pod \"network-metrics-daemon-jcztz\" (UID: \"b0b45556-212a-460b-a5ae-108beeb6197d\") " pod="openshift-multus/network-metrics-daemon-jcztz" Apr 24 21:27:36.549287 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:36.549191 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:36.549287 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:36.549270 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs podName:b0b45556-212a-460b-a5ae-108beeb6197d nodeName:}" failed. No retries permitted until 2026-04-24 21:27:44.549237858 +0000 UTC m=+18.163508116 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs") pod "network-metrics-daemon-jcztz" (UID: "b0b45556-212a-460b-a5ae-108beeb6197d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:36.649980 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:36.649938 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/20dab956-3a56-4a27-bcee-9f75822a7970-original-pull-secret\") pod \"global-pull-secret-syncer-pvftk\" (UID: \"20dab956-3a56-4a27-bcee-9f75822a7970\") " pod="kube-system/global-pull-secret-syncer-pvftk" Apr 24 21:27:36.650155 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:36.650003 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjxt4\" (UniqueName: \"kubernetes.io/projected/25d36037-41e8-4ba4-9072-939c3c9e4e19-kube-api-access-cjxt4\") pod \"network-check-target-cnjcx\" (UID: \"25d36037-41e8-4ba4-9072-939c3c9e4e19\") " pod="openshift-network-diagnostics/network-check-target-cnjcx" Apr 24 21:27:36.650155 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:36.650100 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:36.650155 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:36.650136 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:36.650155 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:36.650154 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:36.650372 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:36.650167 2577 projected.go:194] Error 
preparing data for projected volume kube-api-access-cjxt4 for pod openshift-network-diagnostics/network-check-target-cnjcx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:36.650372 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:36.650180 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20dab956-3a56-4a27-bcee-9f75822a7970-original-pull-secret podName:20dab956-3a56-4a27-bcee-9f75822a7970 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:44.650159666 +0000 UTC m=+18.264429928 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/20dab956-3a56-4a27-bcee-9f75822a7970-original-pull-secret") pod "global-pull-secret-syncer-pvftk" (UID: "20dab956-3a56-4a27-bcee-9f75822a7970") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:36.650372 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:36.650218 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/25d36037-41e8-4ba4-9072-939c3c9e4e19-kube-api-access-cjxt4 podName:25d36037-41e8-4ba4-9072-939c3c9e4e19 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:44.650202014 +0000 UTC m=+18.264472281 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cjxt4" (UniqueName: "kubernetes.io/projected/25d36037-41e8-4ba4-9072-939c3c9e4e19-kube-api-access-cjxt4") pod "network-check-target-cnjcx" (UID: "25d36037-41e8-4ba4-9072-939c3c9e4e19") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:36.966649 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:36.966611 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcztz"
Apr 24 21:27:36.966832 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:36.966755 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jcztz" podUID="b0b45556-212a-460b-a5ae-108beeb6197d"
Apr 24 21:27:36.968150 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:36.968002 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pvftk"
Apr 24 21:27:36.968150 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:36.968101 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pvftk" podUID="20dab956-3a56-4a27-bcee-9f75822a7970"
Apr 24 21:27:37.966474 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:37.966432 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cnjcx"
Apr 24 21:27:37.966729 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:37.966570 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cnjcx" podUID="25d36037-41e8-4ba4-9072-939c3c9e4e19"
Apr 24 21:27:38.966734 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:38.966689 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pvftk"
Apr 24 21:27:38.967192 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:38.966697 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcztz"
Apr 24 21:27:38.967192 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:38.966830 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pvftk" podUID="20dab956-3a56-4a27-bcee-9f75822a7970"
Apr 24 21:27:38.967192 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:38.966880 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jcztz" podUID="b0b45556-212a-460b-a5ae-108beeb6197d"
Apr 24 21:27:39.967435 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:39.967404 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cnjcx"
Apr 24 21:27:39.967845 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:39.967525 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cnjcx" podUID="25d36037-41e8-4ba4-9072-939c3c9e4e19"
Apr 24 21:27:40.966441 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:40.966408 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pvftk"
Apr 24 21:27:40.966635 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:40.966415 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcztz"
Apr 24 21:27:40.966635 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:40.966573 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pvftk" podUID="20dab956-3a56-4a27-bcee-9f75822a7970"
Apr 24 21:27:40.966760 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:40.966636 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jcztz" podUID="b0b45556-212a-460b-a5ae-108beeb6197d"
Apr 24 21:27:41.966531 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:41.966397 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cnjcx"
Apr 24 21:27:41.966878 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:41.966529 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cnjcx" podUID="25d36037-41e8-4ba4-9072-939c3c9e4e19"
Apr 24 21:27:42.966476 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:42.966437 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pvftk"
Apr 24 21:27:42.966660 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:42.966450 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcztz"
Apr 24 21:27:42.966660 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:42.966579 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pvftk" podUID="20dab956-3a56-4a27-bcee-9f75822a7970"
Apr 24 21:27:42.967069 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:42.966679 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jcztz" podUID="b0b45556-212a-460b-a5ae-108beeb6197d"
Apr 24 21:27:43.966725 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:43.966684 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cnjcx"
Apr 24 21:27:43.967188 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:43.966815 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cnjcx" podUID="25d36037-41e8-4ba4-9072-939c3c9e4e19"
Apr 24 21:27:44.610356 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:44.610300 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs\") pod \"network-metrics-daemon-jcztz\" (UID: \"b0b45556-212a-460b-a5ae-108beeb6197d\") " pod="openshift-multus/network-metrics-daemon-jcztz"
Apr 24 21:27:44.610512 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:44.610481 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:44.610580 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:44.610564 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs podName:b0b45556-212a-460b-a5ae-108beeb6197d nodeName:}" failed. No retries permitted until 2026-04-24 21:28:00.610544664 +0000 UTC m=+34.224814926 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs") pod "network-metrics-daemon-jcztz" (UID: "b0b45556-212a-460b-a5ae-108beeb6197d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:44.711353 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:44.711302 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjxt4\" (UniqueName: \"kubernetes.io/projected/25d36037-41e8-4ba4-9072-939c3c9e4e19-kube-api-access-cjxt4\") pod \"network-check-target-cnjcx\" (UID: \"25d36037-41e8-4ba4-9072-939c3c9e4e19\") " pod="openshift-network-diagnostics/network-check-target-cnjcx"
Apr 24 21:27:44.711527 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:44.711420 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/20dab956-3a56-4a27-bcee-9f75822a7970-original-pull-secret\") pod \"global-pull-secret-syncer-pvftk\" (UID: \"20dab956-3a56-4a27-bcee-9f75822a7970\") " pod="kube-system/global-pull-secret-syncer-pvftk"
Apr 24 21:27:44.711527 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:44.711501 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:44.711648 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:44.711528 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:44.711648 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:44.711532 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:44.711648 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:44.711542 2577 projected.go:194] Error preparing data for projected volume kube-api-access-cjxt4 for pod openshift-network-diagnostics/network-check-target-cnjcx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:44.711648 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:44.711599 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/25d36037-41e8-4ba4-9072-939c3c9e4e19-kube-api-access-cjxt4 podName:25d36037-41e8-4ba4-9072-939c3c9e4e19 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:00.711582286 +0000 UTC m=+34.325852544 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cjxt4" (UniqueName: "kubernetes.io/projected/25d36037-41e8-4ba4-9072-939c3c9e4e19-kube-api-access-cjxt4") pod "network-check-target-cnjcx" (UID: "25d36037-41e8-4ba4-9072-939c3c9e4e19") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:44.711648 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:44.711614 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20dab956-3a56-4a27-bcee-9f75822a7970-original-pull-secret podName:20dab956-3a56-4a27-bcee-9f75822a7970 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:00.711609061 +0000 UTC m=+34.325879317 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/20dab956-3a56-4a27-bcee-9f75822a7970-original-pull-secret") pod "global-pull-secret-syncer-pvftk" (UID: "20dab956-3a56-4a27-bcee-9f75822a7970") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:44.966591 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:44.966504 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pvftk"
Apr 24 21:27:44.966750 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:44.966513 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcztz"
Apr 24 21:27:44.966750 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:44.966627 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pvftk" podUID="20dab956-3a56-4a27-bcee-9f75822a7970"
Apr 24 21:27:44.966750 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:44.966698 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jcztz" podUID="b0b45556-212a-460b-a5ae-108beeb6197d"
Apr 24 21:27:45.966708 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:45.966670 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cnjcx"
Apr 24 21:27:45.966875 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:45.966785 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cnjcx" podUID="25d36037-41e8-4ba4-9072-939c3c9e4e19"
Apr 24 21:27:46.970916 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:46.970508 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pvftk"
Apr 24 21:27:46.971616 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:46.970527 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcztz"
Apr 24 21:27:46.971616 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:46.971070 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pvftk" podUID="20dab956-3a56-4a27-bcee-9f75822a7970"
Apr 24 21:27:46.971616 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:46.971167 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jcztz" podUID="b0b45556-212a-460b-a5ae-108beeb6197d"
Apr 24 21:27:47.081929 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:47.081899 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-g5vd4" event={"ID":"67dbaaba-431e-4e09-9019-650f32d8999d","Type":"ContainerStarted","Data":"17159bda3cb6c667440f62b9cde6b40b5a71127e88cbade55b128cdb3fab0d1a"}
Apr 24 21:27:47.083052 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:47.083027 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qx6tc" event={"ID":"06abc9ab-6358-4dae-add4-0d288195411f","Type":"ContainerStarted","Data":"a4f085d5004a708a4ffdaabe3482bb11b78633ad05eac01cc4dd1994523c91b3"}
Apr 24 21:27:47.084148 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:47.084129 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5fmr6" event={"ID":"154e8a35-de7d-4d32-a077-f455b275faf2","Type":"ContainerStarted","Data":"7b82c1cbdf8d662e65f1585df3502375e85c6cda31c89b538fff1956bb8d319f"}
Apr 24 21:27:47.085241 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:47.085223 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7" event={"ID":"54bfba7f-bc92-446e-9646-877d96783afd","Type":"ContainerStarted","Data":"d275f9b10158f0fdc8dc20ff4a1590b8aeff898f21b2819c3e8a8278f4aed712"}
Apr 24 21:27:47.086384 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:47.086363 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9vwgn" event={"ID":"28a670ce-fdb2-4872-af68-5a9ab19b64cc","Type":"ContainerStarted","Data":"a8c950dad9f79878f70f3dbffc113d1a4186d6c58fb891b74a6a9b65b4f2b55c"}
Apr 24 21:27:47.088095 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:47.088078 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log"
Apr 24 21:27:47.088367 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:47.088348 2577 generic.go:358] "Generic (PLEG): container finished" podID="2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0" containerID="a20c6e65532dd9b8458541fef4dc33166aa0a4062efef4437027d8d729c1519d" exitCode=1
Apr 24 21:27:47.088431 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:47.088409 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" event={"ID":"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0","Type":"ContainerStarted","Data":"0e7e3697f953091b2dd88760b58513cc9e2d5383aa8b6bbb0a25b06507c44772"}
Apr 24 21:27:47.088483 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:47.088435 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" event={"ID":"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0","Type":"ContainerStarted","Data":"44d1a4284aab84ffd170e8a4986d7c47871acc2a919938090d42753c5aa92a4d"}
Apr 24 21:27:47.088483 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:47.088449 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" event={"ID":"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0","Type":"ContainerDied","Data":"a20c6e65532dd9b8458541fef4dc33166aa0a4062efef4437027d8d729c1519d"}
Apr 24 21:27:47.088483 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:47.088464 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" event={"ID":"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0","Type":"ContainerStarted","Data":"88cba500653fedad8ce1a1a7d709a534fb43122746cd4e98b802b3b2ea50f9cd"}
Apr 24 21:27:47.089538 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:47.089517 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4bxgg" event={"ID":"82fe3a05-41ab-423c-aab1-343f07ea6c35","Type":"ContainerStarted","Data":"9114eee02d198c76f0cbf46a58e346586d01ef3a021df1901ea8ab15bb779c76"}
Apr 24 21:27:47.090694 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:47.090675 2577 generic.go:358] "Generic (PLEG): container finished" podID="8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c" containerID="16bd703fbb9ba15d288af90a578fdd17f3e88065ae93528a6d516a124b591e84" exitCode=0
Apr 24 21:27:47.090761 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:47.090703 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gcwlm" event={"ID":"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c","Type":"ContainerDied","Data":"16bd703fbb9ba15d288af90a578fdd17f3e88065ae93528a6d516a124b591e84"}
Apr 24 21:27:47.120546 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:47.120493 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-g5vd4" podStartSLOduration=3.360629404 podStartE2EDuration="20.120477734s" podCreationTimestamp="2026-04-24 21:27:27 +0000 UTC" firstStartedPulling="2026-04-24 21:27:29.521635579 +0000 UTC m=+3.135905852" lastFinishedPulling="2026-04-24 21:27:46.281483913 +0000 UTC m=+19.895754182" observedRunningTime="2026-04-24 21:27:47.103558216 +0000 UTC m=+20.717828496" watchObservedRunningTime="2026-04-24 21:27:47.120477734 +0000 UTC m=+20.734748014"
Apr 24 21:27:47.120662 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:47.120641 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5fmr6" podStartSLOduration=3.356726321 podStartE2EDuration="20.120635897s" podCreationTimestamp="2026-04-24 21:27:27 +0000 UTC" firstStartedPulling="2026-04-24 21:27:29.517599694 +0000 UTC m=+3.131869967" lastFinishedPulling="2026-04-24 21:27:46.281509272 +0000 UTC m=+19.895779543" observedRunningTime="2026-04-24 21:27:47.120452273 +0000 UTC m=+20.734722555" watchObservedRunningTime="2026-04-24 21:27:47.120635897 +0000 UTC m=+20.734906176"
Apr 24 21:27:47.158561 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:47.158505 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9vwgn" podStartSLOduration=3.355618572 podStartE2EDuration="20.158486558s" podCreationTimestamp="2026-04-24 21:27:27 +0000 UTC" firstStartedPulling="2026-04-24 21:27:29.518875182 +0000 UTC m=+3.133145454" lastFinishedPulling="2026-04-24 21:27:46.32174317 +0000 UTC m=+19.936013440" observedRunningTime="2026-04-24 21:27:47.157905795 +0000 UTC m=+20.772176067" watchObservedRunningTime="2026-04-24 21:27:47.158486558 +0000 UTC m=+20.772756840"
Apr 24 21:27:47.171146 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:47.171091 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qx6tc" podStartSLOduration=3.410274726 podStartE2EDuration="20.171075453s" podCreationTimestamp="2026-04-24 21:27:27 +0000 UTC" firstStartedPulling="2026-04-24 21:27:29.520767209 +0000 UTC m=+3.135037470" lastFinishedPulling="2026-04-24 21:27:46.281567933 +0000 UTC m=+19.895838197" observedRunningTime="2026-04-24 21:27:47.170963782 +0000 UTC m=+20.785234062" watchObservedRunningTime="2026-04-24 21:27:47.171075453 +0000 UTC m=+20.785345735"
Apr 24 21:27:47.206948 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:47.206885 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-4bxgg" podStartSLOduration=3.4421948 podStartE2EDuration="20.206864177s" podCreationTimestamp="2026-04-24 21:27:27 +0000 UTC" firstStartedPulling="2026-04-24 21:27:29.51866439 +0000 UTC m=+3.132934654" lastFinishedPulling="2026-04-24 21:27:46.283333759 +0000 UTC m=+19.897604031" observedRunningTime="2026-04-24 21:27:47.206862993 +0000 UTC m=+20.821133273" watchObservedRunningTime="2026-04-24 21:27:47.206864177 +0000 UTC m=+20.821134456"
Apr 24 21:27:47.775877 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:47.775846 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 21:27:47.939523 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:47.939406 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:27:47.775869111Z","UUID":"a6a49029-9e9d-424e-a43d-0b4496c33832","Handler":null,"Name":"","Endpoint":""}
Apr 24 21:27:47.942117 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:47.942086 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 21:27:47.942117 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:47.942125 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 24 21:27:47.966610 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:47.966580 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cnjcx"
Apr 24 21:27:47.966768 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:47.966685 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cnjcx" podUID="25d36037-41e8-4ba4-9072-939c3c9e4e19"
Apr 24 21:27:48.095523 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:48.095491 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log"
Apr 24 21:27:48.096023 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:48.095884 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" event={"ID":"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0","Type":"ContainerStarted","Data":"8cfa1d1faa4784b61264d7de9bd7f1d0a1f625d4716aef45643a59a7f715cde5"}
Apr 24 21:27:48.096023 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:48.095925 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" event={"ID":"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0","Type":"ContainerStarted","Data":"7912aaf2b82559b0e999fcccb9d11f0d525364fb14c1abcc70752fd4fb61ff88"}
Apr 24 21:27:48.097160 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:48.097122 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xtssb" event={"ID":"02092214-a2c7-40c0-8e80-688f20002a35","Type":"ContainerStarted","Data":"baa26a64ddc70a77033f99e1b7379b21a3e827814c7c020ff35090226673c350"}
Apr 24 21:27:48.098762 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:48.098726 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7" event={"ID":"54bfba7f-bc92-446e-9646-877d96783afd","Type":"ContainerStarted","Data":"9ca4e458734031de2377f0d7889c2bdf194c2ba967c06bc32c30a6dd1ea02575"}
Apr 24 21:27:48.111073 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:48.111017 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-xtssb" podStartSLOduration=4.35126076 podStartE2EDuration="21.11100128s" podCreationTimestamp="2026-04-24 21:27:27 +0000 UTC" firstStartedPulling="2026-04-24 21:27:29.522222596 +0000 UTC m=+3.136492855" lastFinishedPulling="2026-04-24 21:27:46.281963104 +0000 UTC m=+19.896233375" observedRunningTime="2026-04-24 21:27:48.109844421 +0000 UTC m=+21.724114701" watchObservedRunningTime="2026-04-24 21:27:48.11100128 +0000 UTC m=+21.725271559"
Apr 24 21:27:48.970265 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:48.970217 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcztz"
Apr 24 21:27:48.970265 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:48.970238 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pvftk"
Apr 24 21:27:48.970514 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:48.970379 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jcztz" podUID="b0b45556-212a-460b-a5ae-108beeb6197d"
Apr 24 21:27:48.970732 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:48.970706 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pvftk" podUID="20dab956-3a56-4a27-bcee-9f75822a7970"
Apr 24 21:27:49.102199 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:49.102154 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7" event={"ID":"54bfba7f-bc92-446e-9646-877d96783afd","Type":"ContainerStarted","Data":"bbc4cea297d34d5d21f9b1c415f65115f1499608eb238e9e6842fad61faec91c"}
Apr 24 21:27:49.118981 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:49.118931 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t8mq7" podStartSLOduration=2.741408041 podStartE2EDuration="22.11891462s" podCreationTimestamp="2026-04-24 21:27:27 +0000 UTC" firstStartedPulling="2026-04-24 21:27:29.511003063 +0000 UTC m=+3.125273334" lastFinishedPulling="2026-04-24 21:27:48.88850964 +0000 UTC m=+22.502779913" observedRunningTime="2026-04-24 21:27:49.118868702 +0000 UTC m=+22.733138993" watchObservedRunningTime="2026-04-24 21:27:49.11891462 +0000 UTC m=+22.733184902"
Apr 24 21:27:49.967566 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:49.967364 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cnjcx"
Apr 24 21:27:49.967772 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:49.967653 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cnjcx" podUID="25d36037-41e8-4ba4-9072-939c3c9e4e19"
Apr 24 21:27:50.107722 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:50.107688 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log"
Apr 24 21:27:50.108126 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:50.108079 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" event={"ID":"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0","Type":"ContainerStarted","Data":"90cbfa34cb21ade51a9c4be3dfc838a94bdf7f2b4a4b369506738f0995e9132b"}
Apr 24 21:27:50.117480 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:50.117454 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-g5vd4"
Apr 24 21:27:50.118069 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:50.118048 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-g5vd4"
Apr 24 21:27:50.967323 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:50.967295 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pvftk"
Apr 24 21:27:50.967498 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:50.967328 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcztz"
Apr 24 21:27:50.967498 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:50.967426 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pvftk" podUID="20dab956-3a56-4a27-bcee-9f75822a7970"
Apr 24 21:27:50.967610 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:50.967581 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jcztz" podUID="b0b45556-212a-460b-a5ae-108beeb6197d"
Apr 24 21:27:51.110294 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:51.110258 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-g5vd4"
Apr 24 21:27:51.110816 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:51.110754 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-g5vd4"
Apr 24 21:27:51.967112 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:51.966892 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cnjcx"
Apr 24 21:27:51.967242 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:51.967112 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cnjcx" podUID="25d36037-41e8-4ba4-9072-939c3c9e4e19"
Apr 24 21:27:52.114905 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:52.114872 2577 generic.go:358] "Generic (PLEG): container finished" podID="8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c" containerID="5a2b2b493870963c0a21bbdf0cecc6588c83743e65843283a6b47c5d88e9aeea" exitCode=0
Apr 24 21:27:52.115620 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:52.114959 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gcwlm" event={"ID":"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c","Type":"ContainerDied","Data":"5a2b2b493870963c0a21bbdf0cecc6588c83743e65843283a6b47c5d88e9aeea"}
Apr 24 21:27:52.118158 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:52.118139 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log"
Apr 24 21:27:52.118506 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:52.118482 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" event={"ID":"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0","Type":"ContainerStarted","Data":"62100e010badc070a9edbf2070258a0df6560a5a482ef33938a2f0eceedc05ef"}
Apr 24 21:27:52.118775 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:52.118756 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:52.118854 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:52.118785 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:52.118950 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:52.118935 2577 scope.go:117] "RemoveContainer" containerID="a20c6e65532dd9b8458541fef4dc33166aa0a4062efef4437027d8d729c1519d"
Apr 24 21:27:52.134065 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:52.134047 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:52.969111 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:52.969029 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pvftk"
Apr 24 21:27:52.969271 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:52.969153 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pvftk" podUID="20dab956-3a56-4a27-bcee-9f75822a7970"
Apr 24 21:27:52.969271 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:52.969264 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcztz"
Apr 24 21:27:52.969374 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:52.969348 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jcztz" podUID="b0b45556-212a-460b-a5ae-108beeb6197d"
Apr 24 21:27:53.123755 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:53.123733 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log"
Apr 24 21:27:53.124196 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:53.124068 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" event={"ID":"2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0","Type":"ContainerStarted","Data":"6f7ee00956b44b4071102b0dbf81e5237594aaa320aef8530688f67f48102268"}
Apr 24 21:27:53.124490 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:53.124464 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:53.125983 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:53.125960 2577 generic.go:358] "Generic (PLEG): container finished" podID="8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c" containerID="8f25ee73c9a1ae11a7bc95dd005eb19213fbc3a6b8c2dd18c3240f2fa885b657" exitCode=0
Apr 24 21:27:53.126106 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:53.126027 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gcwlm" event={"ID":"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c","Type":"ContainerDied","Data":"8f25ee73c9a1ae11a7bc95dd005eb19213fbc3a6b8c2dd18c3240f2fa885b657"}
Apr 24 21:27:53.138879 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:53.138860 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:27:53.200880 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:53.200830 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-csxqq" podStartSLOduration=9.15306827 podStartE2EDuration="26.20081418s"
podCreationTimestamp="2026-04-24 21:27:27 +0000 UTC" firstStartedPulling="2026-04-24 21:27:29.520417694 +0000 UTC m=+3.134687965" lastFinishedPulling="2026-04-24 21:27:46.568163604 +0000 UTC m=+20.182433875" observedRunningTime="2026-04-24 21:27:53.169984433 +0000 UTC m=+26.784254713" watchObservedRunningTime="2026-04-24 21:27:53.20081418 +0000 UTC m=+26.815084478" Apr 24 21:27:53.632312 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:53.632282 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pvftk"] Apr 24 21:27:53.632472 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:53.632415 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pvftk" Apr 24 21:27:53.632549 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:53.632526 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pvftk" podUID="20dab956-3a56-4a27-bcee-9f75822a7970" Apr 24 21:27:53.634943 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:53.634913 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cnjcx"] Apr 24 21:27:53.635117 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:53.635025 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cnjcx" Apr 24 21:27:53.635178 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:53.635117 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cnjcx" podUID="25d36037-41e8-4ba4-9072-939c3c9e4e19" Apr 24 21:27:53.640706 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:53.640681 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jcztz"] Apr 24 21:27:53.640823 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:53.640788 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcztz" Apr 24 21:27:53.640913 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:53.640892 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jcztz" podUID="b0b45556-212a-460b-a5ae-108beeb6197d" Apr 24 21:27:54.129674 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:54.129642 2577 generic.go:358] "Generic (PLEG): container finished" podID="8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c" containerID="28e4ad16705bd043235a8e8e49b4a3d3a23bfbac3ceaf5a6932e55af71326943" exitCode=0 Apr 24 21:27:54.130054 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:54.129732 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gcwlm" event={"ID":"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c","Type":"ContainerDied","Data":"28e4ad16705bd043235a8e8e49b4a3d3a23bfbac3ceaf5a6932e55af71326943"} Apr 24 21:27:54.967197 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:54.967165 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcztz" Apr 24 21:27:54.967406 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:54.967176 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cnjcx" Apr 24 21:27:54.967406 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:54.967317 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jcztz" podUID="b0b45556-212a-460b-a5ae-108beeb6197d" Apr 24 21:27:54.967406 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:54.967376 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cnjcx" podUID="25d36037-41e8-4ba4-9072-939c3c9e4e19" Apr 24 21:27:55.967003 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:55.966970 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pvftk" Apr 24 21:27:55.967424 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:55.967093 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pvftk" podUID="20dab956-3a56-4a27-bcee-9f75822a7970" Apr 24 21:27:56.967914 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:56.967700 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cnjcx" Apr 24 21:27:56.968423 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:56.967761 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcztz" Apr 24 21:27:56.968423 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:56.967996 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cnjcx" podUID="25d36037-41e8-4ba4-9072-939c3c9e4e19" Apr 24 21:27:56.968423 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:56.968129 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jcztz" podUID="b0b45556-212a-460b-a5ae-108beeb6197d" Apr 24 21:27:57.967308 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:57.967273 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pvftk" Apr 24 21:27:57.967518 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:57.967413 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pvftk" podUID="20dab956-3a56-4a27-bcee-9f75822a7970" Apr 24 21:27:58.966419 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:58.966385 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cnjcx" Apr 24 21:27:58.966973 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:58.966513 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cnjcx" podUID="25d36037-41e8-4ba4-9072-939c3c9e4e19" Apr 24 21:27:58.966973 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:58.966556 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcztz" Apr 24 21:27:58.966973 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:58.966644 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jcztz" podUID="b0b45556-212a-460b-a5ae-108beeb6197d" Apr 24 21:27:59.966889 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:27:59.966853 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pvftk" Apr 24 21:27:59.967366 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:27:59.966957 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-pvftk" podUID="20dab956-3a56-4a27-bcee-9f75822a7970" Apr 24 21:28:00.624361 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.624328 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs\") pod \"network-metrics-daemon-jcztz\" (UID: \"b0b45556-212a-460b-a5ae-108beeb6197d\") " pod="openshift-multus/network-metrics-daemon-jcztz" Apr 24 21:28:00.624520 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:00.624434 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:00.624520 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:00.624485 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs podName:b0b45556-212a-460b-a5ae-108beeb6197d nodeName:}" failed. No retries permitted until 2026-04-24 21:28:32.624470686 +0000 UTC m=+66.238740943 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs") pod "network-metrics-daemon-jcztz" (UID: "b0b45556-212a-460b-a5ae-108beeb6197d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:00.694611 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.694582 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-36.ec2.internal" event="NodeReady" Apr 24 21:28:00.694751 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.694705 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 21:28:00.724803 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.724766 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/20dab956-3a56-4a27-bcee-9f75822a7970-original-pull-secret\") pod \"global-pull-secret-syncer-pvftk\" (UID: \"20dab956-3a56-4a27-bcee-9f75822a7970\") " pod="kube-system/global-pull-secret-syncer-pvftk" Apr 24 21:28:00.724803 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.724815 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjxt4\" (UniqueName: \"kubernetes.io/projected/25d36037-41e8-4ba4-9072-939c3c9e4e19-kube-api-access-cjxt4\") pod \"network-check-target-cnjcx\" (UID: \"25d36037-41e8-4ba4-9072-939c3c9e4e19\") " pod="openshift-network-diagnostics/network-check-target-cnjcx" Apr 24 21:28:00.725046 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:00.724918 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:28:00.725046 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:00.724939 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 
21:28:00.725046 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:00.724954 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:28:00.725046 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:00.724963 2577 projected.go:194] Error preparing data for projected volume kube-api-access-cjxt4 for pod openshift-network-diagnostics/network-check-target-cnjcx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:28:00.725046 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:00.724986 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20dab956-3a56-4a27-bcee-9f75822a7970-original-pull-secret podName:20dab956-3a56-4a27-bcee-9f75822a7970 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:32.724967223 +0000 UTC m=+66.339237486 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/20dab956-3a56-4a27-bcee-9f75822a7970-original-pull-secret") pod "global-pull-secret-syncer-pvftk" (UID: "20dab956-3a56-4a27-bcee-9f75822a7970") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:28:00.725046 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:00.725005 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/25d36037-41e8-4ba4-9072-939c3c9e4e19-kube-api-access-cjxt4 podName:25d36037-41e8-4ba4-9072-939c3c9e4e19 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:32.724996136 +0000 UTC m=+66.339266400 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cjxt4" (UniqueName: "kubernetes.io/projected/25d36037-41e8-4ba4-9072-939c3c9e4e19-kube-api-access-cjxt4") pod "network-check-target-cnjcx" (UID: "25d36037-41e8-4ba4-9072-939c3c9e4e19") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:28:00.743792 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.743766 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-ttnhg"] Apr 24 21:28:00.775071 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.775043 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zhh6t"] Apr 24 21:28:00.775202 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.775188 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ttnhg" Apr 24 21:28:00.778803 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.778782 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 21:28:00.779203 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.779186 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5mvz7\"" Apr 24 21:28:00.779324 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.779309 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 21:28:00.790680 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.790657 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ttnhg"] Apr 24 21:28:00.790680 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.790683 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zhh6t"] Apr 24 21:28:00.790823 ip-10-0-129-36 kubenswrapper[2577]: I0424 
21:28:00.790771 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zhh6t" Apr 24 21:28:00.793418 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.793397 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 21:28:00.793489 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.793425 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-48pk8\"" Apr 24 21:28:00.793489 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.793428 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 21:28:00.793489 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.793467 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 21:28:00.925943 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.925855 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-metrics-tls\") pod \"dns-default-ttnhg\" (UID: \"6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37\") " pod="openshift-dns/dns-default-ttnhg" Apr 24 21:28:00.925943 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.925896 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rnhk\" (UniqueName: \"kubernetes.io/projected/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-kube-api-access-8rnhk\") pod \"dns-default-ttnhg\" (UID: \"6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37\") " pod="openshift-dns/dns-default-ttnhg" Apr 24 21:28:00.925943 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.925928 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-zxbvc\" (UniqueName: \"kubernetes.io/projected/def556d7-437a-4b70-b31e-6643ed89bc7e-kube-api-access-zxbvc\") pod \"ingress-canary-zhh6t\" (UID: \"def556d7-437a-4b70-b31e-6643ed89bc7e\") " pod="openshift-ingress-canary/ingress-canary-zhh6t" Apr 24 21:28:00.926158 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.925957 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-tmp-dir\") pod \"dns-default-ttnhg\" (UID: \"6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37\") " pod="openshift-dns/dns-default-ttnhg" Apr 24 21:28:00.926158 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.925979 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/def556d7-437a-4b70-b31e-6643ed89bc7e-cert\") pod \"ingress-canary-zhh6t\" (UID: \"def556d7-437a-4b70-b31e-6643ed89bc7e\") " pod="openshift-ingress-canary/ingress-canary-zhh6t" Apr 24 21:28:00.926158 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.926062 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-config-volume\") pod \"dns-default-ttnhg\" (UID: \"6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37\") " pod="openshift-dns/dns-default-ttnhg" Apr 24 21:28:00.967285 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.967234 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cnjcx" Apr 24 21:28:00.967848 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.967434 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcztz" Apr 24 21:28:00.970799 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.970778 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:28:00.970935 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.970881 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mgr5d\"" Apr 24 21:28:00.970935 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.970912 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:28:00.971045 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.970950 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:28:00.971177 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:00.971163 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-sctkp\"" Apr 24 21:28:01.027324 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:01.027294 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/def556d7-437a-4b70-b31e-6643ed89bc7e-cert\") pod \"ingress-canary-zhh6t\" (UID: \"def556d7-437a-4b70-b31e-6643ed89bc7e\") " pod="openshift-ingress-canary/ingress-canary-zhh6t" Apr 24 21:28:01.027461 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:01.027346 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-config-volume\") pod \"dns-default-ttnhg\" (UID: \"6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37\") " pod="openshift-dns/dns-default-ttnhg" Apr 24 21:28:01.027461 ip-10-0-129-36 kubenswrapper[2577]: 
I0424 21:28:01.027416 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-metrics-tls\") pod \"dns-default-ttnhg\" (UID: \"6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37\") " pod="openshift-dns/dns-default-ttnhg" Apr 24 21:28:01.027461 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:01.027444 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rnhk\" (UniqueName: \"kubernetes.io/projected/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-kube-api-access-8rnhk\") pod \"dns-default-ttnhg\" (UID: \"6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37\") " pod="openshift-dns/dns-default-ttnhg" Apr 24 21:28:01.027461 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:01.027449 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:28:01.027640 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:01.027474 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxbvc\" (UniqueName: \"kubernetes.io/projected/def556d7-437a-4b70-b31e-6643ed89bc7e-kube-api-access-zxbvc\") pod \"ingress-canary-zhh6t\" (UID: \"def556d7-437a-4b70-b31e-6643ed89bc7e\") " pod="openshift-ingress-canary/ingress-canary-zhh6t" Apr 24 21:28:01.027640 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:01.027497 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-tmp-dir\") pod \"dns-default-ttnhg\" (UID: \"6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37\") " pod="openshift-dns/dns-default-ttnhg" Apr 24 21:28:01.027640 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:01.027519 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:28:01.027640 ip-10-0-129-36 kubenswrapper[2577]: E0424 
21:28:01.027543 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/def556d7-437a-4b70-b31e-6643ed89bc7e-cert podName:def556d7-437a-4b70-b31e-6643ed89bc7e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:01.527524681 +0000 UTC m=+35.141794945 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/def556d7-437a-4b70-b31e-6643ed89bc7e-cert") pod "ingress-canary-zhh6t" (UID: "def556d7-437a-4b70-b31e-6643ed89bc7e") : secret "canary-serving-cert" not found Apr 24 21:28:01.027640 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:01.027584 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-metrics-tls podName:6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:01.527568762 +0000 UTC m=+35.141839019 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-metrics-tls") pod "dns-default-ttnhg" (UID: "6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37") : secret "dns-default-metrics-tls" not found Apr 24 21:28:01.027918 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:01.027871 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-tmp-dir\") pod \"dns-default-ttnhg\" (UID: \"6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37\") " pod="openshift-dns/dns-default-ttnhg" Apr 24 21:28:01.040306 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:01.040286 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-config-volume\") pod \"dns-default-ttnhg\" (UID: \"6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37\") " pod="openshift-dns/dns-default-ttnhg" Apr 24 21:28:01.045507 ip-10-0-129-36 
kubenswrapper[2577]: I0424 21:28:01.045484 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rnhk\" (UniqueName: \"kubernetes.io/projected/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-kube-api-access-8rnhk\") pod \"dns-default-ttnhg\" (UID: \"6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37\") " pod="openshift-dns/dns-default-ttnhg"
Apr 24 21:28:01.045732 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:01.045713 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxbvc\" (UniqueName: \"kubernetes.io/projected/def556d7-437a-4b70-b31e-6643ed89bc7e-kube-api-access-zxbvc\") pod \"ingress-canary-zhh6t\" (UID: \"def556d7-437a-4b70-b31e-6643ed89bc7e\") " pod="openshift-ingress-canary/ingress-canary-zhh6t"
Apr 24 21:28:01.146614 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:01.146575 2577 generic.go:358] "Generic (PLEG): container finished" podID="8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c" containerID="6f94f9d91eaae39d307a2f596771c98632f9fbdc08520fbb68853e9c7ec50c44" exitCode=0
Apr 24 21:28:01.146807 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:01.146628 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gcwlm" event={"ID":"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c","Type":"ContainerDied","Data":"6f94f9d91eaae39d307a2f596771c98632f9fbdc08520fbb68853e9c7ec50c44"}
Apr 24 21:28:01.531949 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:01.531854 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-metrics-tls\") pod \"dns-default-ttnhg\" (UID: \"6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37\") " pod="openshift-dns/dns-default-ttnhg"
Apr 24 21:28:01.531949 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:01.531901 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/def556d7-437a-4b70-b31e-6643ed89bc7e-cert\") pod \"ingress-canary-zhh6t\" (UID: \"def556d7-437a-4b70-b31e-6643ed89bc7e\") " pod="openshift-ingress-canary/ingress-canary-zhh6t"
Apr 24 21:28:01.532171 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:01.532010 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:28:01.532171 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:01.532016 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:28:01.532171 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:01.532060 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/def556d7-437a-4b70-b31e-6643ed89bc7e-cert podName:def556d7-437a-4b70-b31e-6643ed89bc7e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:02.532046276 +0000 UTC m=+36.146316533 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/def556d7-437a-4b70-b31e-6643ed89bc7e-cert") pod "ingress-canary-zhh6t" (UID: "def556d7-437a-4b70-b31e-6643ed89bc7e") : secret "canary-serving-cert" not found
Apr 24 21:28:01.532171 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:01.532072 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-metrics-tls podName:6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:02.532066837 +0000 UTC m=+36.146337094 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-metrics-tls") pod "dns-default-ttnhg" (UID: "6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37") : secret "dns-default-metrics-tls" not found
Apr 24 21:28:01.967399 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:01.967366 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pvftk"
Apr 24 21:28:01.970314 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:01.970289 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 21:28:02.151122 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:02.151093 2577 generic.go:358] "Generic (PLEG): container finished" podID="8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c" containerID="a3c368453a764927d190ddca81e93b4f6c66fe26c48430a28fd8896a158129a8" exitCode=0
Apr 24 21:28:02.151293 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:02.151140 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gcwlm" event={"ID":"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c","Type":"ContainerDied","Data":"a3c368453a764927d190ddca81e93b4f6c66fe26c48430a28fd8896a158129a8"}
Apr 24 21:28:02.539854 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:02.539757 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-metrics-tls\") pod \"dns-default-ttnhg\" (UID: \"6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37\") " pod="openshift-dns/dns-default-ttnhg"
Apr 24 21:28:02.539854 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:02.539804 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/def556d7-437a-4b70-b31e-6643ed89bc7e-cert\") pod \"ingress-canary-zhh6t\" (UID: \"def556d7-437a-4b70-b31e-6643ed89bc7e\") " pod="openshift-ingress-canary/ingress-canary-zhh6t"
Apr 24 21:28:02.540076 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:02.539906 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:28:02.540076 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:02.539921 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:28:02.540076 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:02.539958 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/def556d7-437a-4b70-b31e-6643ed89bc7e-cert podName:def556d7-437a-4b70-b31e-6643ed89bc7e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:04.539945111 +0000 UTC m=+38.154215369 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/def556d7-437a-4b70-b31e-6643ed89bc7e-cert") pod "ingress-canary-zhh6t" (UID: "def556d7-437a-4b70-b31e-6643ed89bc7e") : secret "canary-serving-cert" not found
Apr 24 21:28:02.540076 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:02.539987 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-metrics-tls podName:6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:04.539968358 +0000 UTC m=+38.154238616 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-metrics-tls") pod "dns-default-ttnhg" (UID: "6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37") : secret "dns-default-metrics-tls" not found
Apr 24 21:28:03.156065 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:03.156028 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gcwlm" event={"ID":"8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c","Type":"ContainerStarted","Data":"94f452c0055976fc7335e0dbb437ac0e1e3d7db35619ce5bac7737ecd792c4c0"}
Apr 24 21:28:04.557010 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:04.556969 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-metrics-tls\") pod \"dns-default-ttnhg\" (UID: \"6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37\") " pod="openshift-dns/dns-default-ttnhg"
Apr 24 21:28:04.557010 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:04.557016 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/def556d7-437a-4b70-b31e-6643ed89bc7e-cert\") pod \"ingress-canary-zhh6t\" (UID: \"def556d7-437a-4b70-b31e-6643ed89bc7e\") " pod="openshift-ingress-canary/ingress-canary-zhh6t"
Apr 24 21:28:04.557493 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:04.557131 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:28:04.557493 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:04.557147 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:28:04.557493 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:04.557181 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/def556d7-437a-4b70-b31e-6643ed89bc7e-cert podName:def556d7-437a-4b70-b31e-6643ed89bc7e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.557168381 +0000 UTC m=+42.171438638 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/def556d7-437a-4b70-b31e-6643ed89bc7e-cert") pod "ingress-canary-zhh6t" (UID: "def556d7-437a-4b70-b31e-6643ed89bc7e") : secret "canary-serving-cert" not found
Apr 24 21:28:04.557493 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:04.557218 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-metrics-tls podName:6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.557200024 +0000 UTC m=+42.171470295 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-metrics-tls") pod "dns-default-ttnhg" (UID: "6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37") : secret "dns-default-metrics-tls" not found
Apr 24 21:28:08.587622 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:08.587579 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/def556d7-437a-4b70-b31e-6643ed89bc7e-cert\") pod \"ingress-canary-zhh6t\" (UID: \"def556d7-437a-4b70-b31e-6643ed89bc7e\") " pod="openshift-ingress-canary/ingress-canary-zhh6t"
Apr 24 21:28:08.588124 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:08.587732 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:28:08.588124 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:08.587768 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-metrics-tls\") pod \"dns-default-ttnhg\" (UID: \"6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37\") " pod="openshift-dns/dns-default-ttnhg"
Apr 24 21:28:08.588124 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:08.587792 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/def556d7-437a-4b70-b31e-6643ed89bc7e-cert podName:def556d7-437a-4b70-b31e-6643ed89bc7e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:16.587775427 +0000 UTC m=+50.202045685 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/def556d7-437a-4b70-b31e-6643ed89bc7e-cert") pod "ingress-canary-zhh6t" (UID: "def556d7-437a-4b70-b31e-6643ed89bc7e") : secret "canary-serving-cert" not found
Apr 24 21:28:08.588124 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:08.587860 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:28:08.588124 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:08.587922 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-metrics-tls podName:6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:16.587905622 +0000 UTC m=+50.202175896 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-metrics-tls") pod "dns-default-ttnhg" (UID: "6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37") : secret "dns-default-metrics-tls" not found
Apr 24 21:28:16.646722 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:16.646681 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-metrics-tls\") pod \"dns-default-ttnhg\" (UID: \"6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37\") " pod="openshift-dns/dns-default-ttnhg"
Apr 24 21:28:16.646722 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:16.646730 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/def556d7-437a-4b70-b31e-6643ed89bc7e-cert\") pod \"ingress-canary-zhh6t\" (UID: \"def556d7-437a-4b70-b31e-6643ed89bc7e\") " pod="openshift-ingress-canary/ingress-canary-zhh6t"
Apr 24 21:28:16.647216 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:16.646827 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:28:16.647216 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:16.646828 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:28:16.647216 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:16.646888 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/def556d7-437a-4b70-b31e-6643ed89bc7e-cert podName:def556d7-437a-4b70-b31e-6643ed89bc7e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:32.646872909 +0000 UTC m=+66.261143166 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/def556d7-437a-4b70-b31e-6643ed89bc7e-cert") pod "ingress-canary-zhh6t" (UID: "def556d7-437a-4b70-b31e-6643ed89bc7e") : secret "canary-serving-cert" not found
Apr 24 21:28:16.647216 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:16.646901 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-metrics-tls podName:6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:32.646894691 +0000 UTC m=+66.261164947 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-metrics-tls") pod "dns-default-ttnhg" (UID: "6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37") : secret "dns-default-metrics-tls" not found
Apr 24 21:28:25.142215 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:25.142186 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-csxqq"
Apr 24 21:28:25.190202 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:25.190139 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-gcwlm" podStartSLOduration=27.660673864 podStartE2EDuration="58.190092919s" podCreationTimestamp="2026-04-24 21:27:27 +0000 UTC" firstStartedPulling="2026-04-24 21:27:29.512372626 +0000 UTC m=+3.126642897" lastFinishedPulling="2026-04-24 21:28:00.041791696 +0000 UTC m=+33.656061952" observedRunningTime="2026-04-24 21:28:03.18858189 +0000 UTC m=+36.802852186" watchObservedRunningTime="2026-04-24 21:28:25.190092919 +0000 UTC m=+58.804363230"
Apr 24 21:28:32.653464 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:32.653418 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-metrics-tls\") pod \"dns-default-ttnhg\" (UID: \"6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37\") " pod="openshift-dns/dns-default-ttnhg"
Apr 24 21:28:32.653464 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:32.653466 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/def556d7-437a-4b70-b31e-6643ed89bc7e-cert\") pod \"ingress-canary-zhh6t\" (UID: \"def556d7-437a-4b70-b31e-6643ed89bc7e\") " pod="openshift-ingress-canary/ingress-canary-zhh6t"
Apr 24 21:28:32.654060 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:32.653484 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs\") pod \"network-metrics-daemon-jcztz\" (UID: \"b0b45556-212a-460b-a5ae-108beeb6197d\") " pod="openshift-multus/network-metrics-daemon-jcztz"
Apr 24 21:28:32.654060 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:32.653574 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:28:32.654060 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:32.653624 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:28:32.654060 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:32.653663 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/def556d7-437a-4b70-b31e-6643ed89bc7e-cert podName:def556d7-437a-4b70-b31e-6643ed89bc7e nodeName:}" failed. No retries permitted until 2026-04-24 21:29:04.653642785 +0000 UTC m=+98.267913061 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/def556d7-437a-4b70-b31e-6643ed89bc7e-cert") pod "ingress-canary-zhh6t" (UID: "def556d7-437a-4b70-b31e-6643ed89bc7e") : secret "canary-serving-cert" not found
Apr 24 21:28:32.654060 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:32.653680 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-metrics-tls podName:6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:04.653673776 +0000 UTC m=+98.267944034 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-metrics-tls") pod "dns-default-ttnhg" (UID: "6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37") : secret "dns-default-metrics-tls" not found
Apr 24 21:28:32.656507 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:32.656485 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 21:28:32.663821 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:32.663797 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 21:28:32.663918 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:28:32.663864 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs podName:b0b45556-212a-460b-a5ae-108beeb6197d nodeName:}" failed. No retries permitted until 2026-04-24 21:29:36.663842815 +0000 UTC m=+130.278113085 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs") pod "network-metrics-daemon-jcztz" (UID: "b0b45556-212a-460b-a5ae-108beeb6197d") : secret "metrics-daemon-secret" not found
Apr 24 21:28:32.754468 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:32.754428 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/20dab956-3a56-4a27-bcee-9f75822a7970-original-pull-secret\") pod \"global-pull-secret-syncer-pvftk\" (UID: \"20dab956-3a56-4a27-bcee-9f75822a7970\") " pod="kube-system/global-pull-secret-syncer-pvftk"
Apr 24 21:28:32.754468 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:32.754470 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjxt4\" (UniqueName: \"kubernetes.io/projected/25d36037-41e8-4ba4-9072-939c3c9e4e19-kube-api-access-cjxt4\") pod \"network-check-target-cnjcx\" (UID: \"25d36037-41e8-4ba4-9072-939c3c9e4e19\") " pod="openshift-network-diagnostics/network-check-target-cnjcx"
Apr 24 21:28:32.758132 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:32.758112 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 21:28:32.758203 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:32.758135 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 21:28:32.768115 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:32.768088 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 21:28:32.768533 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:32.768504 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/20dab956-3a56-4a27-bcee-9f75822a7970-original-pull-secret\") pod \"global-pull-secret-syncer-pvftk\" (UID: \"20dab956-3a56-4a27-bcee-9f75822a7970\") " pod="kube-system/global-pull-secret-syncer-pvftk"
Apr 24 21:28:32.778479 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:32.778455 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjxt4\" (UniqueName: \"kubernetes.io/projected/25d36037-41e8-4ba4-9072-939c3c9e4e19-kube-api-access-cjxt4\") pod \"network-check-target-cnjcx\" (UID: \"25d36037-41e8-4ba4-9072-939c3c9e4e19\") " pod="openshift-network-diagnostics/network-check-target-cnjcx"
Apr 24 21:28:32.875989 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:32.875952 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pvftk"
Apr 24 21:28:33.001998 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:33.001957 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pvftk"]
Apr 24 21:28:33.005514 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:28:33.005487 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20dab956_3a56_4a27_bcee_9f75822a7970.slice/crio-c50726ffb1733af260a8ca7282c093dac88451732d46a2b888795043ae05943f WatchSource:0}: Error finding container c50726ffb1733af260a8ca7282c093dac88451732d46a2b888795043ae05943f: Status 404 returned error can't find the container with id c50726ffb1733af260a8ca7282c093dac88451732d46a2b888795043ae05943f
Apr 24 21:28:33.079847 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:33.079818 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-sctkp\""
Apr 24 21:28:33.087984 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:33.087965 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cnjcx"
Apr 24 21:28:33.203390 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:33.203244 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cnjcx"]
Apr 24 21:28:33.205635 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:28:33.205601 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25d36037_41e8_4ba4_9072_939c3c9e4e19.slice/crio-cea8a6639d7dfc6f10cf0d18ae4d1ecbc21835cc2c033857c463e04fff7120af WatchSource:0}: Error finding container cea8a6639d7dfc6f10cf0d18ae4d1ecbc21835cc2c033857c463e04fff7120af: Status 404 returned error can't find the container with id cea8a6639d7dfc6f10cf0d18ae4d1ecbc21835cc2c033857c463e04fff7120af
Apr 24 21:28:33.214239 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:33.214211 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cnjcx" event={"ID":"25d36037-41e8-4ba4-9072-939c3c9e4e19","Type":"ContainerStarted","Data":"cea8a6639d7dfc6f10cf0d18ae4d1ecbc21835cc2c033857c463e04fff7120af"}
Apr 24 21:28:33.215185 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:33.215162 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pvftk" event={"ID":"20dab956-3a56-4a27-bcee-9f75822a7970","Type":"ContainerStarted","Data":"c50726ffb1733af260a8ca7282c093dac88451732d46a2b888795043ae05943f"}
Apr 24 21:28:38.226663 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:38.226624 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cnjcx" event={"ID":"25d36037-41e8-4ba4-9072-939c3c9e4e19","Type":"ContainerStarted","Data":"14b61b1f255cfe99d0891be3bb3c80bd1d486a4dcc12086bb2b7bd9ea4ef9b8e"}
Apr 24 21:28:38.227222 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:38.226725 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-cnjcx"
Apr 24 21:28:38.227938 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:38.227912 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pvftk" event={"ID":"20dab956-3a56-4a27-bcee-9f75822a7970","Type":"ContainerStarted","Data":"647ae79f8c93fb38bf8460d4f1dccb2ede698adfe73a2686f344f89fb06c90e9"}
Apr 24 21:28:38.246912 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:28:38.246863 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-cnjcx" podStartSLOduration=67.008531172 podStartE2EDuration="1m11.246849715s" podCreationTimestamp="2026-04-24 21:27:27 +0000 UTC" firstStartedPulling="2026-04-24 21:28:33.20766451 +0000 UTC m=+66.821934767" lastFinishedPulling="2026-04-24 21:28:37.445983039 +0000 UTC m=+71.060253310" observedRunningTime="2026-04-24 21:28:38.245713516 +0000 UTC m=+71.859983795" watchObservedRunningTime="2026-04-24 21:28:38.246849715 +0000 UTC m=+71.861119997"
Apr 24 21:29:04.680023 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:04.679981 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/def556d7-437a-4b70-b31e-6643ed89bc7e-cert\") pod \"ingress-canary-zhh6t\" (UID: \"def556d7-437a-4b70-b31e-6643ed89bc7e\") " pod="openshift-ingress-canary/ingress-canary-zhh6t"
Apr 24 21:29:04.680464 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:04.680050 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-metrics-tls\") pod \"dns-default-ttnhg\" (UID: \"6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37\") " pod="openshift-dns/dns-default-ttnhg"
Apr 24 21:29:04.680464 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:04.680150 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:29:04.680464 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:04.680230 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/def556d7-437a-4b70-b31e-6643ed89bc7e-cert podName:def556d7-437a-4b70-b31e-6643ed89bc7e nodeName:}" failed. No retries permitted until 2026-04-24 21:30:08.680213151 +0000 UTC m=+162.294483408 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/def556d7-437a-4b70-b31e-6643ed89bc7e-cert") pod "ingress-canary-zhh6t" (UID: "def556d7-437a-4b70-b31e-6643ed89bc7e") : secret "canary-serving-cert" not found
Apr 24 21:29:04.680464 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:04.680150 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:29:04.680464 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:04.680323 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-metrics-tls podName:6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37 nodeName:}" failed. No retries permitted until 2026-04-24 21:30:08.680307642 +0000 UTC m=+162.294577903 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-metrics-tls") pod "dns-default-ttnhg" (UID: "6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37") : secret "dns-default-metrics-tls" not found
Apr 24 21:29:09.232062 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:09.232023 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-cnjcx"
Apr 24 21:29:09.250723 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:09.250675 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-pvftk" podStartSLOduration=96.807741953 podStartE2EDuration="1m41.250661007s" podCreationTimestamp="2026-04-24 21:27:28 +0000 UTC" firstStartedPulling="2026-04-24 21:28:33.0071686 +0000 UTC m=+66.621438857" lastFinishedPulling="2026-04-24 21:28:37.45008764 +0000 UTC m=+71.064357911" observedRunningTime="2026-04-24 21:28:38.265799029 +0000 UTC m=+71.880069310" watchObservedRunningTime="2026-04-24 21:29:09.250661007 +0000 UTC m=+102.864931285"
Apr 24 21:29:35.875804 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:35.875770 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-hkjn8"]
Apr 24 21:29:35.878718 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:35.878695 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkjn8"
Apr 24 21:29:35.883902 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:35.883877 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 21:29:35.884043 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:35.883877 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 21:29:35.884043 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:35.883941 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 24 21:29:35.884043 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:35.883959 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-wmk6v\""
Apr 24 21:29:35.884179 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:35.884049 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 24 21:29:35.892896 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:35.892874 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-hkjn8"]
Apr 24 21:29:35.982037 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:35.982003 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-b4467575d-c7qhh"]
Apr 24 21:29:35.984743 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:35.984727 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-b4467575d-c7qhh"
Apr 24 21:29:35.993710 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:35.993681 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 24 21:29:35.998618 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:35.998596 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 24 21:29:35.998742 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:35.998701 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 24 21:29:35.998812 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:35.998683 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 24 21:29:35.998812 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:35.998753 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-mv8ql\""
Apr 24 21:29:35.998812 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:35.998672 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 24 21:29:35.998812 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:35.998704 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 24 21:29:36.000004 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:35.999983 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-b4467575d-c7qhh"]
Apr 24 21:29:36.000204 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:36.000185 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d3bee81d-b2d9-4efd-8dd1-045747be92da-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hkjn8\" (UID: \"d3bee81d-b2d9-4efd-8dd1-045747be92da\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkjn8"
Apr 24 21:29:36.000303 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:36.000284 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d3bee81d-b2d9-4efd-8dd1-045747be92da-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-hkjn8\" (UID: \"d3bee81d-b2d9-4efd-8dd1-045747be92da\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkjn8"
Apr 24 21:29:36.000365 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:36.000337 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l72n\" (UniqueName: \"kubernetes.io/projected/d3bee81d-b2d9-4efd-8dd1-045747be92da-kube-api-access-9l72n\") pod \"cluster-monitoring-operator-75587bd455-hkjn8\" (UID: \"d3bee81d-b2d9-4efd-8dd1-045747be92da\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkjn8"
Apr 24 21:29:36.101065 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:36.101031 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hlfs\" (UniqueName: \"kubernetes.io/projected/969ea49e-4e0c-48d5-9e89-dfddef64c993-kube-api-access-4hlfs\") pod \"router-default-b4467575d-c7qhh\" (UID: \"969ea49e-4e0c-48d5-9e89-dfddef64c993\") " pod="openshift-ingress/router-default-b4467575d-c7qhh"
Apr 24 21:29:36.101065 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:36.101071 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d3bee81d-b2d9-4efd-8dd1-045747be92da-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hkjn8\" (UID: \"d3bee81d-b2d9-4efd-8dd1-045747be92da\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkjn8"
Apr 24 21:29:36.101360 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:36.101097 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/969ea49e-4e0c-48d5-9e89-dfddef64c993-service-ca-bundle\") pod \"router-default-b4467575d-c7qhh\" (UID: \"969ea49e-4e0c-48d5-9e89-dfddef64c993\") " pod="openshift-ingress/router-default-b4467575d-c7qhh"
Apr 24 21:29:36.101360 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:36.101218 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 21:29:36.101360 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:36.101276 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/969ea49e-4e0c-48d5-9e89-dfddef64c993-stats-auth\") pod \"router-default-b4467575d-c7qhh\" (UID: \"969ea49e-4e0c-48d5-9e89-dfddef64c993\") " pod="openshift-ingress/router-default-b4467575d-c7qhh"
Apr 24 21:29:36.101360 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:36.101308 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/969ea49e-4e0c-48d5-9e89-dfddef64c993-default-certificate\") pod \"router-default-b4467575d-c7qhh\" (UID: \"969ea49e-4e0c-48d5-9e89-dfddef64c993\") " pod="openshift-ingress/router-default-b4467575d-c7qhh"
Apr 24 21:29:36.101360 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:36.101331 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3bee81d-b2d9-4efd-8dd1-045747be92da-cluster-monitoring-operator-tls podName:d3bee81d-b2d9-4efd-8dd1-045747be92da nodeName:}" failed. No retries permitted until 2026-04-24 21:29:36.601307655 +0000 UTC m=+130.215577916 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d3bee81d-b2d9-4efd-8dd1-045747be92da-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-hkjn8" (UID: "d3bee81d-b2d9-4efd-8dd1-045747be92da") : secret "cluster-monitoring-operator-tls" not found
Apr 24 21:29:36.101602 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:36.101408 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d3bee81d-b2d9-4efd-8dd1-045747be92da-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-hkjn8\" (UID: \"d3bee81d-b2d9-4efd-8dd1-045747be92da\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkjn8"
Apr 24 21:29:36.101602 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:36.101444 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969ea49e-4e0c-48d5-9e89-dfddef64c993-metrics-certs\") pod \"router-default-b4467575d-c7qhh\" (UID: \"969ea49e-4e0c-48d5-9e89-dfddef64c993\") " pod="openshift-ingress/router-default-b4467575d-c7qhh"
Apr 24 21:29:36.101602 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:36.101474 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9l72n\" (UniqueName: \"kubernetes.io/projected/d3bee81d-b2d9-4efd-8dd1-045747be92da-kube-api-access-9l72n\") pod \"cluster-monitoring-operator-75587bd455-hkjn8\" (UID: \"d3bee81d-b2d9-4efd-8dd1-045747be92da\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkjn8"
Apr 24 21:29:36.102109 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:36.102091 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\"
(UniqueName: \"kubernetes.io/configmap/d3bee81d-b2d9-4efd-8dd1-045747be92da-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-hkjn8\" (UID: \"d3bee81d-b2d9-4efd-8dd1-045747be92da\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkjn8" Apr 24 21:29:36.121754 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:36.121723 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l72n\" (UniqueName: \"kubernetes.io/projected/d3bee81d-b2d9-4efd-8dd1-045747be92da-kube-api-access-9l72n\") pod \"cluster-monitoring-operator-75587bd455-hkjn8\" (UID: \"d3bee81d-b2d9-4efd-8dd1-045747be92da\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkjn8" Apr 24 21:29:36.202065 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:36.201968 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969ea49e-4e0c-48d5-9e89-dfddef64c993-metrics-certs\") pod \"router-default-b4467575d-c7qhh\" (UID: \"969ea49e-4e0c-48d5-9e89-dfddef64c993\") " pod="openshift-ingress/router-default-b4467575d-c7qhh" Apr 24 21:29:36.202065 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:36.202037 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hlfs\" (UniqueName: \"kubernetes.io/projected/969ea49e-4e0c-48d5-9e89-dfddef64c993-kube-api-access-4hlfs\") pod \"router-default-b4467575d-c7qhh\" (UID: \"969ea49e-4e0c-48d5-9e89-dfddef64c993\") " pod="openshift-ingress/router-default-b4467575d-c7qhh" Apr 24 21:29:36.202065 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:36.202065 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/969ea49e-4e0c-48d5-9e89-dfddef64c993-service-ca-bundle\") pod \"router-default-b4467575d-c7qhh\" (UID: \"969ea49e-4e0c-48d5-9e89-dfddef64c993\") " pod="openshift-ingress/router-default-b4467575d-c7qhh" 
Apr 24 21:29:36.202358 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:36.202107 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/969ea49e-4e0c-48d5-9e89-dfddef64c993-stats-auth\") pod \"router-default-b4467575d-c7qhh\" (UID: \"969ea49e-4e0c-48d5-9e89-dfddef64c993\") " pod="openshift-ingress/router-default-b4467575d-c7qhh" Apr 24 21:29:36.202358 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:36.202133 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/969ea49e-4e0c-48d5-9e89-dfddef64c993-default-certificate\") pod \"router-default-b4467575d-c7qhh\" (UID: \"969ea49e-4e0c-48d5-9e89-dfddef64c993\") " pod="openshift-ingress/router-default-b4467575d-c7qhh" Apr 24 21:29:36.202358 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:36.202135 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:29:36.202358 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:36.202224 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969ea49e-4e0c-48d5-9e89-dfddef64c993-metrics-certs podName:969ea49e-4e0c-48d5-9e89-dfddef64c993 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:36.702203793 +0000 UTC m=+130.316474063 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/969ea49e-4e0c-48d5-9e89-dfddef64c993-metrics-certs") pod "router-default-b4467575d-c7qhh" (UID: "969ea49e-4e0c-48d5-9e89-dfddef64c993") : secret "router-metrics-certs-default" not found Apr 24 21:29:36.202358 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:36.202269 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/969ea49e-4e0c-48d5-9e89-dfddef64c993-service-ca-bundle podName:969ea49e-4e0c-48d5-9e89-dfddef64c993 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:36.702241432 +0000 UTC m=+130.316511694 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/969ea49e-4e0c-48d5-9e89-dfddef64c993-service-ca-bundle") pod "router-default-b4467575d-c7qhh" (UID: "969ea49e-4e0c-48d5-9e89-dfddef64c993") : configmap references non-existent config key: service-ca.crt Apr 24 21:29:36.204608 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:36.204579 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/969ea49e-4e0c-48d5-9e89-dfddef64c993-stats-auth\") pod \"router-default-b4467575d-c7qhh\" (UID: \"969ea49e-4e0c-48d5-9e89-dfddef64c993\") " pod="openshift-ingress/router-default-b4467575d-c7qhh" Apr 24 21:29:36.204718 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:36.204612 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/969ea49e-4e0c-48d5-9e89-dfddef64c993-default-certificate\") pod \"router-default-b4467575d-c7qhh\" (UID: \"969ea49e-4e0c-48d5-9e89-dfddef64c993\") " pod="openshift-ingress/router-default-b4467575d-c7qhh" Apr 24 21:29:36.214154 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:36.214135 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hlfs\" (UniqueName: 
\"kubernetes.io/projected/969ea49e-4e0c-48d5-9e89-dfddef64c993-kube-api-access-4hlfs\") pod \"router-default-b4467575d-c7qhh\" (UID: \"969ea49e-4e0c-48d5-9e89-dfddef64c993\") " pod="openshift-ingress/router-default-b4467575d-c7qhh" Apr 24 21:29:36.604676 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:36.604643 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d3bee81d-b2d9-4efd-8dd1-045747be92da-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hkjn8\" (UID: \"d3bee81d-b2d9-4efd-8dd1-045747be92da\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkjn8" Apr 24 21:29:36.604843 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:36.604795 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:29:36.604882 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:36.604871 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3bee81d-b2d9-4efd-8dd1-045747be92da-cluster-monitoring-operator-tls podName:d3bee81d-b2d9-4efd-8dd1-045747be92da nodeName:}" failed. No retries permitted until 2026-04-24 21:29:37.604855024 +0000 UTC m=+131.219125285 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d3bee81d-b2d9-4efd-8dd1-045747be92da-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-hkjn8" (UID: "d3bee81d-b2d9-4efd-8dd1-045747be92da") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:29:36.705342 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:36.705301 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969ea49e-4e0c-48d5-9e89-dfddef64c993-metrics-certs\") pod \"router-default-b4467575d-c7qhh\" (UID: \"969ea49e-4e0c-48d5-9e89-dfddef64c993\") " pod="openshift-ingress/router-default-b4467575d-c7qhh" Apr 24 21:29:36.705520 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:36.705448 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:29:36.705520 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:36.705478 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/969ea49e-4e0c-48d5-9e89-dfddef64c993-service-ca-bundle\") pod \"router-default-b4467575d-c7qhh\" (UID: \"969ea49e-4e0c-48d5-9e89-dfddef64c993\") " pod="openshift-ingress/router-default-b4467575d-c7qhh" Apr 24 21:29:36.705520 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:36.705508 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969ea49e-4e0c-48d5-9e89-dfddef64c993-metrics-certs podName:969ea49e-4e0c-48d5-9e89-dfddef64c993 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:37.705494508 +0000 UTC m=+131.319764765 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/969ea49e-4e0c-48d5-9e89-dfddef64c993-metrics-certs") pod "router-default-b4467575d-c7qhh" (UID: "969ea49e-4e0c-48d5-9e89-dfddef64c993") : secret "router-metrics-certs-default" not found Apr 24 21:29:36.705680 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:36.705542 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs\") pod \"network-metrics-daemon-jcztz\" (UID: \"b0b45556-212a-460b-a5ae-108beeb6197d\") " pod="openshift-multus/network-metrics-daemon-jcztz" Apr 24 21:29:36.705680 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:36.705575 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/969ea49e-4e0c-48d5-9e89-dfddef64c993-service-ca-bundle podName:969ea49e-4e0c-48d5-9e89-dfddef64c993 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:37.705558466 +0000 UTC m=+131.319828724 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/969ea49e-4e0c-48d5-9e89-dfddef64c993-service-ca-bundle") pod "router-default-b4467575d-c7qhh" (UID: "969ea49e-4e0c-48d5-9e89-dfddef64c993") : configmap references non-existent config key: service-ca.crt Apr 24 21:29:36.705680 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:36.705593 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 21:29:36.705680 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:36.705616 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs podName:b0b45556-212a-460b-a5ae-108beeb6197d nodeName:}" failed. No retries permitted until 2026-04-24 21:31:38.705608551 +0000 UTC m=+252.319878807 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs") pod "network-metrics-daemon-jcztz" (UID: "b0b45556-212a-460b-a5ae-108beeb6197d") : secret "metrics-daemon-secret" not found Apr 24 21:29:37.612932 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:37.612881 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d3bee81d-b2d9-4efd-8dd1-045747be92da-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hkjn8\" (UID: \"d3bee81d-b2d9-4efd-8dd1-045747be92da\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkjn8" Apr 24 21:29:37.613366 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:37.613036 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:29:37.613366 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:37.613099 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3bee81d-b2d9-4efd-8dd1-045747be92da-cluster-monitoring-operator-tls podName:d3bee81d-b2d9-4efd-8dd1-045747be92da nodeName:}" failed. No retries permitted until 2026-04-24 21:29:39.613084402 +0000 UTC m=+133.227354666 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d3bee81d-b2d9-4efd-8dd1-045747be92da-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-hkjn8" (UID: "d3bee81d-b2d9-4efd-8dd1-045747be92da") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:29:37.713262 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:37.713223 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969ea49e-4e0c-48d5-9e89-dfddef64c993-metrics-certs\") pod \"router-default-b4467575d-c7qhh\" (UID: \"969ea49e-4e0c-48d5-9e89-dfddef64c993\") " pod="openshift-ingress/router-default-b4467575d-c7qhh" Apr 24 21:29:37.713433 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:37.713408 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:29:37.713507 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:37.713492 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/969ea49e-4e0c-48d5-9e89-dfddef64c993-service-ca-bundle podName:969ea49e-4e0c-48d5-9e89-dfddef64c993 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:39.71347895 +0000 UTC m=+133.327749207 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/969ea49e-4e0c-48d5-9e89-dfddef64c993-service-ca-bundle") pod "router-default-b4467575d-c7qhh" (UID: "969ea49e-4e0c-48d5-9e89-dfddef64c993") : configmap references non-existent config key: service-ca.crt Apr 24 21:29:37.713569 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:37.713513 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969ea49e-4e0c-48d5-9e89-dfddef64c993-metrics-certs podName:969ea49e-4e0c-48d5-9e89-dfddef64c993 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:29:39.713507183 +0000 UTC m=+133.327777440 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/969ea49e-4e0c-48d5-9e89-dfddef64c993-metrics-certs") pod "router-default-b4467575d-c7qhh" (UID: "969ea49e-4e0c-48d5-9e89-dfddef64c993") : secret "router-metrics-certs-default" not found Apr 24 21:29:37.713569 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:37.713412 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/969ea49e-4e0c-48d5-9e89-dfddef64c993-service-ca-bundle\") pod \"router-default-b4467575d-c7qhh\" (UID: \"969ea49e-4e0c-48d5-9e89-dfddef64c993\") " pod="openshift-ingress/router-default-b4467575d-c7qhh" Apr 24 21:29:38.830991 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:38.830960 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gjg96"] Apr 24 21:29:38.834130 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:38.834113 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gjg96" Apr 24 21:29:38.836740 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:38.836719 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 24 21:29:38.836740 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:38.836733 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:29:38.837687 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:38.837671 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-ct726\"" Apr 24 21:29:38.837735 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:38.837703 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 24 21:29:38.848720 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:38.848695 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gjg96"] Apr 24 21:29:38.923099 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:38.923065 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h8d4\" (UniqueName: \"kubernetes.io/projected/00ab5f90-95f6-4c68-b97c-a55985c40e09-kube-api-access-4h8d4\") pod \"cluster-samples-operator-6dc5bdb6b4-gjg96\" (UID: \"00ab5f90-95f6-4c68-b97c-a55985c40e09\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gjg96" Apr 24 21:29:38.923281 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:38.923121 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/00ab5f90-95f6-4c68-b97c-a55985c40e09-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gjg96\" (UID: \"00ab5f90-95f6-4c68-b97c-a55985c40e09\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gjg96" Apr 24 21:29:39.024554 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:39.024512 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4h8d4\" (UniqueName: \"kubernetes.io/projected/00ab5f90-95f6-4c68-b97c-a55985c40e09-kube-api-access-4h8d4\") pod \"cluster-samples-operator-6dc5bdb6b4-gjg96\" (UID: \"00ab5f90-95f6-4c68-b97c-a55985c40e09\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gjg96" Apr 24 21:29:39.024734 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:39.024565 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/00ab5f90-95f6-4c68-b97c-a55985c40e09-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gjg96\" (UID: \"00ab5f90-95f6-4c68-b97c-a55985c40e09\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gjg96" Apr 24 21:29:39.024734 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:39.024687 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:29:39.024812 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:39.024737 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00ab5f90-95f6-4c68-b97c-a55985c40e09-samples-operator-tls podName:00ab5f90-95f6-4c68-b97c-a55985c40e09 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:39.524724336 +0000 UTC m=+133.138994598 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/00ab5f90-95f6-4c68-b97c-a55985c40e09-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gjg96" (UID: "00ab5f90-95f6-4c68-b97c-a55985c40e09") : secret "samples-operator-tls" not found Apr 24 21:29:39.035790 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:39.035763 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h8d4\" (UniqueName: \"kubernetes.io/projected/00ab5f90-95f6-4c68-b97c-a55985c40e09-kube-api-access-4h8d4\") pod \"cluster-samples-operator-6dc5bdb6b4-gjg96\" (UID: \"00ab5f90-95f6-4c68-b97c-a55985c40e09\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gjg96" Apr 24 21:29:39.528262 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:39.528200 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/00ab5f90-95f6-4c68-b97c-a55985c40e09-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gjg96\" (UID: \"00ab5f90-95f6-4c68-b97c-a55985c40e09\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gjg96" Apr 24 21:29:39.528452 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:39.528370 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:29:39.528452 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:39.528446 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00ab5f90-95f6-4c68-b97c-a55985c40e09-samples-operator-tls podName:00ab5f90-95f6-4c68-b97c-a55985c40e09 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:40.528427097 +0000 UTC m=+134.142697358 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/00ab5f90-95f6-4c68-b97c-a55985c40e09-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gjg96" (UID: "00ab5f90-95f6-4c68-b97c-a55985c40e09") : secret "samples-operator-tls" not found Apr 24 21:29:39.629289 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:39.629235 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d3bee81d-b2d9-4efd-8dd1-045747be92da-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hkjn8\" (UID: \"d3bee81d-b2d9-4efd-8dd1-045747be92da\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkjn8" Apr 24 21:29:39.629400 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:39.629378 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:29:39.629467 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:39.629455 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3bee81d-b2d9-4efd-8dd1-045747be92da-cluster-monitoring-operator-tls podName:d3bee81d-b2d9-4efd-8dd1-045747be92da nodeName:}" failed. No retries permitted until 2026-04-24 21:29:43.629441116 +0000 UTC m=+137.243711376 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d3bee81d-b2d9-4efd-8dd1-045747be92da-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-hkjn8" (UID: "d3bee81d-b2d9-4efd-8dd1-045747be92da") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:29:39.730390 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:39.730355 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969ea49e-4e0c-48d5-9e89-dfddef64c993-metrics-certs\") pod \"router-default-b4467575d-c7qhh\" (UID: \"969ea49e-4e0c-48d5-9e89-dfddef64c993\") " pod="openshift-ingress/router-default-b4467575d-c7qhh" Apr 24 21:29:39.730533 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:39.730444 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/969ea49e-4e0c-48d5-9e89-dfddef64c993-service-ca-bundle\") pod \"router-default-b4467575d-c7qhh\" (UID: \"969ea49e-4e0c-48d5-9e89-dfddef64c993\") " pod="openshift-ingress/router-default-b4467575d-c7qhh" Apr 24 21:29:39.730533 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:39.730516 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:29:39.730611 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:39.730562 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/969ea49e-4e0c-48d5-9e89-dfddef64c993-service-ca-bundle podName:969ea49e-4e0c-48d5-9e89-dfddef64c993 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:43.730549489 +0000 UTC m=+137.344819747 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/969ea49e-4e0c-48d5-9e89-dfddef64c993-service-ca-bundle") pod "router-default-b4467575d-c7qhh" (UID: "969ea49e-4e0c-48d5-9e89-dfddef64c993") : configmap references non-existent config key: service-ca.crt Apr 24 21:29:39.730611 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:39.730582 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969ea49e-4e0c-48d5-9e89-dfddef64c993-metrics-certs podName:969ea49e-4e0c-48d5-9e89-dfddef64c993 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:43.730568832 +0000 UTC m=+137.344839088 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/969ea49e-4e0c-48d5-9e89-dfddef64c993-metrics-certs") pod "router-default-b4467575d-c7qhh" (UID: "969ea49e-4e0c-48d5-9e89-dfddef64c993") : secret "router-metrics-certs-default" not found Apr 24 21:29:40.538517 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:40.538467 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/00ab5f90-95f6-4c68-b97c-a55985c40e09-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gjg96\" (UID: \"00ab5f90-95f6-4c68-b97c-a55985c40e09\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gjg96" Apr 24 21:29:40.538884 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:40.538621 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:29:40.538884 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:40.538686 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00ab5f90-95f6-4c68-b97c-a55985c40e09-samples-operator-tls podName:00ab5f90-95f6-4c68-b97c-a55985c40e09 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:29:42.538670895 +0000 UTC m=+136.152941156 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/00ab5f90-95f6-4c68-b97c-a55985c40e09-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gjg96" (UID: "00ab5f90-95f6-4c68-b97c-a55985c40e09") : secret "samples-operator-tls" not found Apr 24 21:29:40.971448 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:40.971417 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-96wlk"] Apr 24 21:29:40.974229 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:40.974213 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-96wlk" Apr 24 21:29:40.978728 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:40.978709 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 24 21:29:40.979184 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:40.979165 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:29:40.979302 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:40.979285 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-q9pls\"" Apr 24 21:29:40.979475 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:40.979450 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 24 21:29:40.980709 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:40.980690 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 24 21:29:40.988363 ip-10-0-129-36 kubenswrapper[2577]: I0424 
21:29:40.988330 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-96wlk"] Apr 24 21:29:40.988467 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:40.988375 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 24 21:29:41.140629 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:41.140601 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5fmr6_154e8a35-de7d-4d32-a077-f455b275faf2/dns-node-resolver/0.log" Apr 24 21:29:41.142926 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:41.142905 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b14cefbb-8e93-43c4-8a2d-f70afbe6cab4-trusted-ca\") pod \"console-operator-9d4b6777b-96wlk\" (UID: \"b14cefbb-8e93-43c4-8a2d-f70afbe6cab4\") " pod="openshift-console-operator/console-operator-9d4b6777b-96wlk" Apr 24 21:29:41.143003 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:41.142935 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrng7\" (UniqueName: \"kubernetes.io/projected/b14cefbb-8e93-43c4-8a2d-f70afbe6cab4-kube-api-access-qrng7\") pod \"console-operator-9d4b6777b-96wlk\" (UID: \"b14cefbb-8e93-43c4-8a2d-f70afbe6cab4\") " pod="openshift-console-operator/console-operator-9d4b6777b-96wlk" Apr 24 21:29:41.143003 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:41.142973 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b14cefbb-8e93-43c4-8a2d-f70afbe6cab4-config\") pod \"console-operator-9d4b6777b-96wlk\" (UID: \"b14cefbb-8e93-43c4-8a2d-f70afbe6cab4\") " pod="openshift-console-operator/console-operator-9d4b6777b-96wlk" Apr 24 21:29:41.143103 ip-10-0-129-36 kubenswrapper[2577]: I0424 
21:29:41.143035 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b14cefbb-8e93-43c4-8a2d-f70afbe6cab4-serving-cert\") pod \"console-operator-9d4b6777b-96wlk\" (UID: \"b14cefbb-8e93-43c4-8a2d-f70afbe6cab4\") " pod="openshift-console-operator/console-operator-9d4b6777b-96wlk" Apr 24 21:29:41.243534 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:41.243452 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b14cefbb-8e93-43c4-8a2d-f70afbe6cab4-serving-cert\") pod \"console-operator-9d4b6777b-96wlk\" (UID: \"b14cefbb-8e93-43c4-8a2d-f70afbe6cab4\") " pod="openshift-console-operator/console-operator-9d4b6777b-96wlk" Apr 24 21:29:41.243673 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:41.243617 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b14cefbb-8e93-43c4-8a2d-f70afbe6cab4-trusted-ca\") pod \"console-operator-9d4b6777b-96wlk\" (UID: \"b14cefbb-8e93-43c4-8a2d-f70afbe6cab4\") " pod="openshift-console-operator/console-operator-9d4b6777b-96wlk" Apr 24 21:29:41.243673 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:41.243649 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrng7\" (UniqueName: \"kubernetes.io/projected/b14cefbb-8e93-43c4-8a2d-f70afbe6cab4-kube-api-access-qrng7\") pod \"console-operator-9d4b6777b-96wlk\" (UID: \"b14cefbb-8e93-43c4-8a2d-f70afbe6cab4\") " pod="openshift-console-operator/console-operator-9d4b6777b-96wlk" Apr 24 21:29:41.243785 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:41.243685 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b14cefbb-8e93-43c4-8a2d-f70afbe6cab4-config\") pod \"console-operator-9d4b6777b-96wlk\" (UID: 
\"b14cefbb-8e93-43c4-8a2d-f70afbe6cab4\") " pod="openshift-console-operator/console-operator-9d4b6777b-96wlk" Apr 24 21:29:41.244239 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:41.244214 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b14cefbb-8e93-43c4-8a2d-f70afbe6cab4-config\") pod \"console-operator-9d4b6777b-96wlk\" (UID: \"b14cefbb-8e93-43c4-8a2d-f70afbe6cab4\") " pod="openshift-console-operator/console-operator-9d4b6777b-96wlk" Apr 24 21:29:41.244394 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:41.244376 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b14cefbb-8e93-43c4-8a2d-f70afbe6cab4-trusted-ca\") pod \"console-operator-9d4b6777b-96wlk\" (UID: \"b14cefbb-8e93-43c4-8a2d-f70afbe6cab4\") " pod="openshift-console-operator/console-operator-9d4b6777b-96wlk" Apr 24 21:29:41.245763 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:41.245743 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b14cefbb-8e93-43c4-8a2d-f70afbe6cab4-serving-cert\") pod \"console-operator-9d4b6777b-96wlk\" (UID: \"b14cefbb-8e93-43c4-8a2d-f70afbe6cab4\") " pod="openshift-console-operator/console-operator-9d4b6777b-96wlk" Apr 24 21:29:41.253688 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:41.253665 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrng7\" (UniqueName: \"kubernetes.io/projected/b14cefbb-8e93-43c4-8a2d-f70afbe6cab4-kube-api-access-qrng7\") pod \"console-operator-9d4b6777b-96wlk\" (UID: \"b14cefbb-8e93-43c4-8a2d-f70afbe6cab4\") " pod="openshift-console-operator/console-operator-9d4b6777b-96wlk" Apr 24 21:29:41.283665 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:41.283639 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-96wlk" Apr 24 21:29:41.406296 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:41.406261 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-96wlk"] Apr 24 21:29:41.409620 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:29:41.409593 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb14cefbb_8e93_43c4_8a2d_f70afbe6cab4.slice/crio-c786330d106f1bcd02ddb4d2656fe79d88b129b2036de9897fc1a5cdcfffc822 WatchSource:0}: Error finding container c786330d106f1bcd02ddb4d2656fe79d88b129b2036de9897fc1a5cdcfffc822: Status 404 returned error can't find the container with id c786330d106f1bcd02ddb4d2656fe79d88b129b2036de9897fc1a5cdcfffc822 Apr 24 21:29:42.338783 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:42.338752 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qx6tc_06abc9ab-6358-4dae-add4-0d288195411f/node-ca/0.log" Apr 24 21:29:42.350519 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:42.350482 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-96wlk" event={"ID":"b14cefbb-8e93-43c4-8a2d-f70afbe6cab4","Type":"ContainerStarted","Data":"c786330d106f1bcd02ddb4d2656fe79d88b129b2036de9897fc1a5cdcfffc822"} Apr 24 21:29:42.553523 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:42.553490 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/00ab5f90-95f6-4c68-b97c-a55985c40e09-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gjg96\" (UID: \"00ab5f90-95f6-4c68-b97c-a55985c40e09\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gjg96" Apr 24 21:29:42.553680 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:42.553640 2577 
secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:29:42.553724 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:42.553710 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00ab5f90-95f6-4c68-b97c-a55985c40e09-samples-operator-tls podName:00ab5f90-95f6-4c68-b97c-a55985c40e09 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:46.55369157 +0000 UTC m=+140.167961827 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/00ab5f90-95f6-4c68-b97c-a55985c40e09-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gjg96" (UID: "00ab5f90-95f6-4c68-b97c-a55985c40e09") : secret "samples-operator-tls" not found Apr 24 21:29:43.353281 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:43.353233 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/0.log" Apr 24 21:29:43.353739 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:43.353293 2577 generic.go:358] "Generic (PLEG): container finished" podID="b14cefbb-8e93-43c4-8a2d-f70afbe6cab4" containerID="8dad22d9482cde097f3b993a3d79a4a83a190dad36a0c7e9bacdb52776dfbbdf" exitCode=255 Apr 24 21:29:43.353739 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:43.353328 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-96wlk" event={"ID":"b14cefbb-8e93-43c4-8a2d-f70afbe6cab4","Type":"ContainerDied","Data":"8dad22d9482cde097f3b993a3d79a4a83a190dad36a0c7e9bacdb52776dfbbdf"} Apr 24 21:29:43.353739 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:43.353551 2577 scope.go:117] "RemoveContainer" containerID="8dad22d9482cde097f3b993a3d79a4a83a190dad36a0c7e9bacdb52776dfbbdf" Apr 24 21:29:43.664349 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:43.664234 
2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d3bee81d-b2d9-4efd-8dd1-045747be92da-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hkjn8\" (UID: \"d3bee81d-b2d9-4efd-8dd1-045747be92da\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkjn8" Apr 24 21:29:43.664500 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:43.664384 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:29:43.664500 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:43.664450 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3bee81d-b2d9-4efd-8dd1-045747be92da-cluster-monitoring-operator-tls podName:d3bee81d-b2d9-4efd-8dd1-045747be92da nodeName:}" failed. No retries permitted until 2026-04-24 21:29:51.66443455 +0000 UTC m=+145.278704812 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d3bee81d-b2d9-4efd-8dd1-045747be92da-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-hkjn8" (UID: "d3bee81d-b2d9-4efd-8dd1-045747be92da") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:29:43.765082 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:43.765050 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969ea49e-4e0c-48d5-9e89-dfddef64c993-metrics-certs\") pod \"router-default-b4467575d-c7qhh\" (UID: \"969ea49e-4e0c-48d5-9e89-dfddef64c993\") " pod="openshift-ingress/router-default-b4467575d-c7qhh" Apr 24 21:29:43.765197 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:43.765123 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/969ea49e-4e0c-48d5-9e89-dfddef64c993-service-ca-bundle\") pod \"router-default-b4467575d-c7qhh\" (UID: \"969ea49e-4e0c-48d5-9e89-dfddef64c993\") " pod="openshift-ingress/router-default-b4467575d-c7qhh" Apr 24 21:29:43.765233 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:43.765189 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:29:43.765289 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:43.765236 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/969ea49e-4e0c-48d5-9e89-dfddef64c993-service-ca-bundle podName:969ea49e-4e0c-48d5-9e89-dfddef64c993 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:51.765222211 +0000 UTC m=+145.379492468 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/969ea49e-4e0c-48d5-9e89-dfddef64c993-service-ca-bundle") pod "router-default-b4467575d-c7qhh" (UID: "969ea49e-4e0c-48d5-9e89-dfddef64c993") : configmap references non-existent config key: service-ca.crt Apr 24 21:29:43.765289 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:43.765269 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969ea49e-4e0c-48d5-9e89-dfddef64c993-metrics-certs podName:969ea49e-4e0c-48d5-9e89-dfddef64c993 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:51.765244629 +0000 UTC m=+145.379514886 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/969ea49e-4e0c-48d5-9e89-dfddef64c993-metrics-certs") pod "router-default-b4467575d-c7qhh" (UID: "969ea49e-4e0c-48d5-9e89-dfddef64c993") : secret "router-metrics-certs-default" not found Apr 24 21:29:44.356417 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:44.356390 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/1.log" Apr 24 21:29:44.356789 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:44.356727 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/0.log" Apr 24 21:29:44.356789 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:44.356757 2577 generic.go:358] "Generic (PLEG): container finished" podID="b14cefbb-8e93-43c4-8a2d-f70afbe6cab4" containerID="364d2fa0f71e3a4b8a7ff530547e2ebc869a7fb1339ffca943a7cef82aefc2a6" exitCode=255 Apr 24 21:29:44.356860 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:44.356788 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-96wlk" 
event={"ID":"b14cefbb-8e93-43c4-8a2d-f70afbe6cab4","Type":"ContainerDied","Data":"364d2fa0f71e3a4b8a7ff530547e2ebc869a7fb1339ffca943a7cef82aefc2a6"} Apr 24 21:29:44.356860 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:44.356832 2577 scope.go:117] "RemoveContainer" containerID="8dad22d9482cde097f3b993a3d79a4a83a190dad36a0c7e9bacdb52776dfbbdf" Apr 24 21:29:44.357093 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:44.357073 2577 scope.go:117] "RemoveContainer" containerID="364d2fa0f71e3a4b8a7ff530547e2ebc869a7fb1339ffca943a7cef82aefc2a6" Apr 24 21:29:44.357301 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:44.357276 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-96wlk_openshift-console-operator(b14cefbb-8e93-43c4-8a2d-f70afbe6cab4)\"" pod="openshift-console-operator/console-operator-9d4b6777b-96wlk" podUID="b14cefbb-8e93-43c4-8a2d-f70afbe6cab4" Apr 24 21:29:45.359966 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:45.359939 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/1.log" Apr 24 21:29:45.360380 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:45.360286 2577 scope.go:117] "RemoveContainer" containerID="364d2fa0f71e3a4b8a7ff530547e2ebc869a7fb1339ffca943a7cef82aefc2a6" Apr 24 21:29:45.360465 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:45.360448 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-96wlk_openshift-console-operator(b14cefbb-8e93-43c4-8a2d-f70afbe6cab4)\"" pod="openshift-console-operator/console-operator-9d4b6777b-96wlk" podUID="b14cefbb-8e93-43c4-8a2d-f70afbe6cab4" 
Apr 24 21:29:46.586849 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:46.586808 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/00ab5f90-95f6-4c68-b97c-a55985c40e09-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gjg96\" (UID: \"00ab5f90-95f6-4c68-b97c-a55985c40e09\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gjg96" Apr 24 21:29:46.587238 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:46.586962 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:29:46.587238 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:46.587026 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00ab5f90-95f6-4c68-b97c-a55985c40e09-samples-operator-tls podName:00ab5f90-95f6-4c68-b97c-a55985c40e09 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:54.587011051 +0000 UTC m=+148.201281308 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/00ab5f90-95f6-4c68-b97c-a55985c40e09-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gjg96" (UID: "00ab5f90-95f6-4c68-b97c-a55985c40e09") : secret "samples-operator-tls" not found Apr 24 21:29:48.587731 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:48.587698 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-db2jz"] Apr 24 21:29:48.591654 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:48.591637 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-db2jz" Apr 24 21:29:48.595026 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:48.595002 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-bqvbk\"" Apr 24 21:29:48.595127 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:48.595041 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 24 21:29:48.595127 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:48.595062 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 24 21:29:48.601310 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:48.601291 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-db2jz"] Apr 24 21:29:48.707040 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:48.706997 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44vls\" (UniqueName: \"kubernetes.io/projected/18ef204f-6aa6-4107-8f8f-26e4ab42c428-kube-api-access-44vls\") pod \"migrator-74bb7799d9-db2jz\" (UID: \"18ef204f-6aa6-4107-8f8f-26e4ab42c428\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-db2jz" Apr 24 21:29:48.807570 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:48.807524 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44vls\" (UniqueName: \"kubernetes.io/projected/18ef204f-6aa6-4107-8f8f-26e4ab42c428-kube-api-access-44vls\") pod \"migrator-74bb7799d9-db2jz\" (UID: \"18ef204f-6aa6-4107-8f8f-26e4ab42c428\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-db2jz" Apr 24 21:29:48.822744 ip-10-0-129-36 kubenswrapper[2577]: 
I0424 21:29:48.822709 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44vls\" (UniqueName: \"kubernetes.io/projected/18ef204f-6aa6-4107-8f8f-26e4ab42c428-kube-api-access-44vls\") pod \"migrator-74bb7799d9-db2jz\" (UID: \"18ef204f-6aa6-4107-8f8f-26e4ab42c428\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-db2jz" Apr 24 21:29:48.899872 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:48.899781 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-db2jz" Apr 24 21:29:49.017655 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:49.017621 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-db2jz"] Apr 24 21:29:49.020302 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:29:49.020273 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18ef204f_6aa6_4107_8f8f_26e4ab42c428.slice/crio-71e583500d2a1bb20e9d5fc360c3972ce76721a17878b70f0821ebf0e922f175 WatchSource:0}: Error finding container 71e583500d2a1bb20e9d5fc360c3972ce76721a17878b70f0821ebf0e922f175: Status 404 returned error can't find the container with id 71e583500d2a1bb20e9d5fc360c3972ce76721a17878b70f0821ebf0e922f175 Apr 24 21:29:49.368718 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:49.368685 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-db2jz" event={"ID":"18ef204f-6aa6-4107-8f8f-26e4ab42c428","Type":"ContainerStarted","Data":"71e583500d2a1bb20e9d5fc360c3972ce76721a17878b70f0821ebf0e922f175"} Apr 24 21:29:50.372333 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:50.372300 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-db2jz" 
event={"ID":"18ef204f-6aa6-4107-8f8f-26e4ab42c428","Type":"ContainerStarted","Data":"3381cc1fa4766b8b8318815651212c34b32adfa7dc6e62bb9e9720b08e9aaa3e"} Apr 24 21:29:50.372333 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:50.372336 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-db2jz" event={"ID":"18ef204f-6aa6-4107-8f8f-26e4ab42c428","Type":"ContainerStarted","Data":"05cf3f63f6f12e196e2265434c089cc018b05dd9bc53996d3311b7638b557a43"} Apr 24 21:29:50.389241 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:50.389195 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-db2jz" podStartSLOduration=1.442039381 podStartE2EDuration="2.389180109s" podCreationTimestamp="2026-04-24 21:29:48 +0000 UTC" firstStartedPulling="2026-04-24 21:29:49.02207089 +0000 UTC m=+142.636341161" lastFinishedPulling="2026-04-24 21:29:49.969211629 +0000 UTC m=+143.583481889" observedRunningTime="2026-04-24 21:29:50.388212317 +0000 UTC m=+144.002482587" watchObservedRunningTime="2026-04-24 21:29:50.389180109 +0000 UTC m=+144.003450387" Apr 24 21:29:51.284010 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:51.283976 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-96wlk" Apr 24 21:29:51.284010 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:51.284015 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-96wlk" Apr 24 21:29:51.284408 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:51.284395 2577 scope.go:117] "RemoveContainer" containerID="364d2fa0f71e3a4b8a7ff530547e2ebc869a7fb1339ffca943a7cef82aefc2a6" Apr 24 21:29:51.284568 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:51.284551 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-96wlk_openshift-console-operator(b14cefbb-8e93-43c4-8a2d-f70afbe6cab4)\"" pod="openshift-console-operator/console-operator-9d4b6777b-96wlk" podUID="b14cefbb-8e93-43c4-8a2d-f70afbe6cab4" Apr 24 21:29:51.732340 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:51.732306 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d3bee81d-b2d9-4efd-8dd1-045747be92da-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hkjn8\" (UID: \"d3bee81d-b2d9-4efd-8dd1-045747be92da\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkjn8" Apr 24 21:29:51.732787 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:51.732445 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:29:51.732787 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:51.732506 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3bee81d-b2d9-4efd-8dd1-045747be92da-cluster-monitoring-operator-tls podName:d3bee81d-b2d9-4efd-8dd1-045747be92da nodeName:}" failed. No retries permitted until 2026-04-24 21:30:07.732490459 +0000 UTC m=+161.346760735 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d3bee81d-b2d9-4efd-8dd1-045747be92da-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-hkjn8" (UID: "d3bee81d-b2d9-4efd-8dd1-045747be92da") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:29:51.833529 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:51.833476 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969ea49e-4e0c-48d5-9e89-dfddef64c993-metrics-certs\") pod \"router-default-b4467575d-c7qhh\" (UID: \"969ea49e-4e0c-48d5-9e89-dfddef64c993\") " pod="openshift-ingress/router-default-b4467575d-c7qhh" Apr 24 21:29:51.833709 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:51.833575 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/969ea49e-4e0c-48d5-9e89-dfddef64c993-service-ca-bundle\") pod \"router-default-b4467575d-c7qhh\" (UID: \"969ea49e-4e0c-48d5-9e89-dfddef64c993\") " pod="openshift-ingress/router-default-b4467575d-c7qhh" Apr 24 21:29:51.834174 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:51.833631 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:29:51.834174 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:51.834028 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/969ea49e-4e0c-48d5-9e89-dfddef64c993-service-ca-bundle podName:969ea49e-4e0c-48d5-9e89-dfddef64c993 nodeName:}" failed. No retries permitted until 2026-04-24 21:30:07.833712352 +0000 UTC m=+161.447982629 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/969ea49e-4e0c-48d5-9e89-dfddef64c993-service-ca-bundle") pod "router-default-b4467575d-c7qhh" (UID: "969ea49e-4e0c-48d5-9e89-dfddef64c993") : configmap references non-existent config key: service-ca.crt Apr 24 21:29:51.834174 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:29:51.834141 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969ea49e-4e0c-48d5-9e89-dfddef64c993-metrics-certs podName:969ea49e-4e0c-48d5-9e89-dfddef64c993 nodeName:}" failed. No retries permitted until 2026-04-24 21:30:07.834107737 +0000 UTC m=+161.448378009 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/969ea49e-4e0c-48d5-9e89-dfddef64c993-metrics-certs") pod "router-default-b4467575d-c7qhh" (UID: "969ea49e-4e0c-48d5-9e89-dfddef64c993") : secret "router-metrics-certs-default" not found Apr 24 21:29:54.654233 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:54.654179 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/00ab5f90-95f6-4c68-b97c-a55985c40e09-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gjg96\" (UID: \"00ab5f90-95f6-4c68-b97c-a55985c40e09\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gjg96" Apr 24 21:29:54.656598 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:54.656578 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/00ab5f90-95f6-4c68-b97c-a55985c40e09-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gjg96\" (UID: \"00ab5f90-95f6-4c68-b97c-a55985c40e09\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gjg96" Apr 24 21:29:54.742648 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:54.742599 2577 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gjg96" Apr 24 21:29:54.856630 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:54.856601 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gjg96"] Apr 24 21:29:55.385207 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:55.385169 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gjg96" event={"ID":"00ab5f90-95f6-4c68-b97c-a55985c40e09","Type":"ContainerStarted","Data":"982a226678f526ea1c8da53a5dcdb63ca9a799108ea40c0382a966c92fe503fd"} Apr 24 21:29:57.391830 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:57.391792 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gjg96" event={"ID":"00ab5f90-95f6-4c68-b97c-a55985c40e09","Type":"ContainerStarted","Data":"a1d0f6d898b9452b02de4a807de3cf02adc72c817217785951cc5a3be0663899"} Apr 24 21:29:57.391830 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:57.391830 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gjg96" event={"ID":"00ab5f90-95f6-4c68-b97c-a55985c40e09","Type":"ContainerStarted","Data":"898008df43552982efd5ba0eda54c083c6fac3dddf97307ce02769e72e523ef4"} Apr 24 21:29:57.412365 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:29:57.412315 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gjg96" podStartSLOduration=17.944048446 podStartE2EDuration="19.41230141s" podCreationTimestamp="2026-04-24 21:29:38 +0000 UTC" firstStartedPulling="2026-04-24 21:29:54.902663456 +0000 UTC m=+148.516933713" lastFinishedPulling="2026-04-24 21:29:56.370916418 +0000 UTC 
m=+149.985186677" observedRunningTime="2026-04-24 21:29:57.411267524 +0000 UTC m=+151.025537797" watchObservedRunningTime="2026-04-24 21:29:57.41230141 +0000 UTC m=+151.026571689"
Apr 24 21:30:02.966957 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:02.966925 2577 scope.go:117] "RemoveContainer" containerID="364d2fa0f71e3a4b8a7ff530547e2ebc869a7fb1339ffca943a7cef82aefc2a6"
Apr 24 21:30:03.407889 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:03.407862 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/1.log"
Apr 24 21:30:03.408054 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:03.407942 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-96wlk" event={"ID":"b14cefbb-8e93-43c4-8a2d-f70afbe6cab4","Type":"ContainerStarted","Data":"e01ba92369f6d9e43cea4de02c5d21aa98d976a9649ca518e9b21b6af082940d"}
Apr 24 21:30:03.408233 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:03.408205 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-96wlk"
Apr 24 21:30:03.427615 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:03.427574 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-96wlk" podStartSLOduration=21.79649903 podStartE2EDuration="23.427561763s" podCreationTimestamp="2026-04-24 21:29:40 +0000 UTC" firstStartedPulling="2026-04-24 21:29:41.411467661 +0000 UTC m=+135.025737918" lastFinishedPulling="2026-04-24 21:29:43.042530391 +0000 UTC m=+136.656800651" observedRunningTime="2026-04-24 21:30:03.426067933 +0000 UTC m=+157.040338574" watchObservedRunningTime="2026-04-24 21:30:03.427561763 +0000 UTC m=+157.041832042"
Apr 24 21:30:03.755850 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:03.755773 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-96wlk"
Apr 24 21:30:03.784131 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:30:03.784081 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-ttnhg" podUID="6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37"
Apr 24 21:30:03.799515 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:30:03.799478 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-zhh6t" podUID="def556d7-437a-4b70-b31e-6643ed89bc7e"
Apr 24 21:30:03.981926 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:30:03.981882 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-jcztz" podUID="b0b45556-212a-460b-a5ae-108beeb6197d"
Apr 24 21:30:04.410334 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:04.410295 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ttnhg"
Apr 24 21:30:07.751594 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:07.751507 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d3bee81d-b2d9-4efd-8dd1-045747be92da-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hkjn8\" (UID: \"d3bee81d-b2d9-4efd-8dd1-045747be92da\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkjn8"
Apr 24 21:30:07.753866 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:07.753840 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d3bee81d-b2d9-4efd-8dd1-045747be92da-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hkjn8\" (UID: \"d3bee81d-b2d9-4efd-8dd1-045747be92da\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkjn8"
Apr 24 21:30:07.852348 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:07.852303 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969ea49e-4e0c-48d5-9e89-dfddef64c993-metrics-certs\") pod \"router-default-b4467575d-c7qhh\" (UID: \"969ea49e-4e0c-48d5-9e89-dfddef64c993\") " pod="openshift-ingress/router-default-b4467575d-c7qhh"
Apr 24 21:30:07.852562 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:07.852370 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/969ea49e-4e0c-48d5-9e89-dfddef64c993-service-ca-bundle\") pod \"router-default-b4467575d-c7qhh\" (UID: \"969ea49e-4e0c-48d5-9e89-dfddef64c993\") " pod="openshift-ingress/router-default-b4467575d-c7qhh"
Apr 24 21:30:07.852908 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:07.852887 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/969ea49e-4e0c-48d5-9e89-dfddef64c993-service-ca-bundle\") pod \"router-default-b4467575d-c7qhh\" (UID: \"969ea49e-4e0c-48d5-9e89-dfddef64c993\") " pod="openshift-ingress/router-default-b4467575d-c7qhh"
Apr 24 21:30:07.854692 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:07.854666 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969ea49e-4e0c-48d5-9e89-dfddef64c993-metrics-certs\") pod \"router-default-b4467575d-c7qhh\" (UID: \"969ea49e-4e0c-48d5-9e89-dfddef64c993\") " pod="openshift-ingress/router-default-b4467575d-c7qhh"
Apr 24 21:30:07.986946 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:07.986912 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkjn8"
Apr 24 21:30:08.092612 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:08.092580 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-b4467575d-c7qhh"
Apr 24 21:30:08.103916 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:08.103887 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-hkjn8"]
Apr 24 21:30:08.106546 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:30:08.106487 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3bee81d_b2d9_4efd_8dd1_045747be92da.slice/crio-a384dbbf9c7d013b57190f33381ea73a8708032efbb5368d1649adaa3b5ea1ec WatchSource:0}: Error finding container a384dbbf9c7d013b57190f33381ea73a8708032efbb5368d1649adaa3b5ea1ec: Status 404 returned error can't find the container with id a384dbbf9c7d013b57190f33381ea73a8708032efbb5368d1649adaa3b5ea1ec
Apr 24 21:30:08.223088 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:08.223057 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-b4467575d-c7qhh"]
Apr 24 21:30:08.226444 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:30:08.226409 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod969ea49e_4e0c_48d5_9e89_dfddef64c993.slice/crio-f49e32ac8b43c67a922a7e1c042afe4bfaa81a23bb76fa89a5cc5ca38e835297 WatchSource:0}: Error finding container f49e32ac8b43c67a922a7e1c042afe4bfaa81a23bb76fa89a5cc5ca38e835297: Status 404 returned error can't find the container with id f49e32ac8b43c67a922a7e1c042afe4bfaa81a23bb76fa89a5cc5ca38e835297
Apr 24 21:30:08.422570 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:08.422524 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-b4467575d-c7qhh" event={"ID":"969ea49e-4e0c-48d5-9e89-dfddef64c993","Type":"ContainerStarted","Data":"72bf6ee1694cad52c52dd6e0fca3ac950a9df0eb698172e150d43f71123eceae"}
Apr 24 21:30:08.422570 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:08.422568 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-b4467575d-c7qhh" event={"ID":"969ea49e-4e0c-48d5-9e89-dfddef64c993","Type":"ContainerStarted","Data":"f49e32ac8b43c67a922a7e1c042afe4bfaa81a23bb76fa89a5cc5ca38e835297"}
Apr 24 21:30:08.423797 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:08.423770 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkjn8" event={"ID":"d3bee81d-b2d9-4efd-8dd1-045747be92da","Type":"ContainerStarted","Data":"a384dbbf9c7d013b57190f33381ea73a8708032efbb5368d1649adaa3b5ea1ec"}
Apr 24 21:30:08.447143 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:08.447089 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-b4467575d-c7qhh" podStartSLOduration=33.447073628 podStartE2EDuration="33.447073628s" podCreationTimestamp="2026-04-24 21:29:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:30:08.446580776 +0000 UTC m=+162.060851056" watchObservedRunningTime="2026-04-24 21:30:08.447073628 +0000 UTC m=+162.061343884"
Apr 24 21:30:08.758913 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:08.758826 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-metrics-tls\") pod \"dns-default-ttnhg\" (UID: \"6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37\") " pod="openshift-dns/dns-default-ttnhg"
Apr 24 21:30:08.758913 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:08.758897 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/def556d7-437a-4b70-b31e-6643ed89bc7e-cert\") pod \"ingress-canary-zhh6t\" (UID: \"def556d7-437a-4b70-b31e-6643ed89bc7e\") " pod="openshift-ingress-canary/ingress-canary-zhh6t"
Apr 24 21:30:08.761431 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:08.761404 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37-metrics-tls\") pod \"dns-default-ttnhg\" (UID: \"6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37\") " pod="openshift-dns/dns-default-ttnhg"
Apr 24 21:30:08.761577 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:08.761554 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/def556d7-437a-4b70-b31e-6643ed89bc7e-cert\") pod \"ingress-canary-zhh6t\" (UID: \"def556d7-437a-4b70-b31e-6643ed89bc7e\") " pod="openshift-ingress-canary/ingress-canary-zhh6t"
Apr 24 21:30:08.913868 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:08.913837 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5mvz7\""
Apr 24 21:30:08.922034 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:08.922006 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ttnhg"
Apr 24 21:30:09.058001 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:09.057543 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ttnhg"]
Apr 24 21:30:09.060374 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:30:09.060343 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f3bd325_dc2b_4af8_a5a7_0afd9a1fbd37.slice/crio-b3d4bfd72074b0dd5ff7137a5ff538da22185a0c56af8b6c8c82bd82424c108a WatchSource:0}: Error finding container b3d4bfd72074b0dd5ff7137a5ff538da22185a0c56af8b6c8c82bd82424c108a: Status 404 returned error can't find the container with id b3d4bfd72074b0dd5ff7137a5ff538da22185a0c56af8b6c8c82bd82424c108a
Apr 24 21:30:09.092900 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:09.092869 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-b4467575d-c7qhh"
Apr 24 21:30:09.095671 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:09.095651 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-b4467575d-c7qhh"
Apr 24 21:30:09.428257 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:09.428200 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ttnhg" event={"ID":"6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37","Type":"ContainerStarted","Data":"b3d4bfd72074b0dd5ff7137a5ff538da22185a0c56af8b6c8c82bd82424c108a"}
Apr 24 21:30:09.428543 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:09.428522 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-b4467575d-c7qhh"
Apr 24 21:30:09.429949 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:09.429925 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-b4467575d-c7qhh"
Apr 24 21:30:10.432751 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:10.432692 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkjn8" event={"ID":"d3bee81d-b2d9-4efd-8dd1-045747be92da","Type":"ContainerStarted","Data":"3110eae9c954d14423d22f99aaea1b4e31d66d21613e7455290fcc678e8012de"}
Apr 24 21:30:10.460464 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:10.460406 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkjn8" podStartSLOduration=33.897095553 podStartE2EDuration="35.460387129s" podCreationTimestamp="2026-04-24 21:29:35 +0000 UTC" firstStartedPulling="2026-04-24 21:30:08.108327277 +0000 UTC m=+161.722597534" lastFinishedPulling="2026-04-24 21:30:09.67161885 +0000 UTC m=+163.285889110" observedRunningTime="2026-04-24 21:30:10.459480222 +0000 UTC m=+164.073750502" watchObservedRunningTime="2026-04-24 21:30:10.460387129 +0000 UTC m=+164.074657407"
Apr 24 21:30:11.286295 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.286225 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wkgcw"]
Apr 24 21:30:11.289835 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.289806 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-kvj2r"]
Apr 24 21:30:11.289987 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.289964 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wkgcw"
Apr 24 21:30:11.292342 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.292322 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-k7rf7\""
Apr 24 21:30:11.293090 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.293054 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-kvj2r"
Apr 24 21:30:11.293203 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.293127 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 24 21:30:11.295462 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.295419 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-7xnmx\""
Apr 24 21:30:11.295818 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.295586 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 24 21:30:11.295818 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.295787 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 24 21:30:11.301963 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.301940 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wkgcw"]
Apr 24 21:30:11.311303 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.311280 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-kvj2r"]
Apr 24 21:30:11.376801 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.376767 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c93a2e52-5f47-430a-80f6-0b8d1ee5ab9a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-wkgcw\" (UID: \"c93a2e52-5f47-430a-80f6-0b8d1ee5ab9a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wkgcw"
Apr 24 21:30:11.376946 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.376809 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkf6r\" (UniqueName: \"kubernetes.io/projected/5701db21-73a5-4846-bd66-5cb8f4331749-kube-api-access-rkf6r\") pod \"downloads-6bcc868b7-kvj2r\" (UID: \"5701db21-73a5-4846-bd66-5cb8f4331749\") " pod="openshift-console/downloads-6bcc868b7-kvj2r"
Apr 24 21:30:11.395751 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.395719 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-pd6d8"]
Apr 24 21:30:11.399178 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.399157 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-pd6d8"
Apr 24 21:30:11.402456 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.402422 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 21:30:11.402586 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.402485 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 24 21:30:11.402586 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.402422 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 21:30:11.402905 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.402889 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4xl79\""
Apr 24 21:30:11.402995 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.402934 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 21:30:11.419491 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.419471 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-pd6d8"]
Apr 24 21:30:11.436557 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.436439 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ttnhg" event={"ID":"6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37","Type":"ContainerStarted","Data":"4804cfd4c5094681672dccbfddf9ebaa688b3cbfb0fb78f0a0da2978694b6d3a"}
Apr 24 21:30:11.436952 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.436567 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ttnhg" event={"ID":"6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37","Type":"ContainerStarted","Data":"a751c489f76dbf11c2da2c7ed3377b69937779a0174fd8e76766215874dfc312"}
Apr 24 21:30:11.436952 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.436821 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-ttnhg"
Apr 24 21:30:11.458002 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.457958 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-ttnhg" podStartSLOduration=129.919989262 podStartE2EDuration="2m11.457945413s" podCreationTimestamp="2026-04-24 21:28:00 +0000 UTC" firstStartedPulling="2026-04-24 21:30:09.062609232 +0000 UTC m=+162.676879489" lastFinishedPulling="2026-04-24 21:30:10.600565383 +0000 UTC m=+164.214835640" observedRunningTime="2026-04-24 21:30:11.457676533 +0000 UTC m=+165.071946812" watchObservedRunningTime="2026-04-24 21:30:11.457945413 +0000 UTC m=+165.072215691"
Apr 24 21:30:11.477849 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.477816 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d1f1f29b-6485-4944-a0a9-b2afb33787d9-data-volume\") pod \"insights-runtime-extractor-pd6d8\" (UID: \"d1f1f29b-6485-4944-a0a9-b2afb33787d9\") " pod="openshift-insights/insights-runtime-extractor-pd6d8"
Apr 24 21:30:11.478049 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.478027 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d1f1f29b-6485-4944-a0a9-b2afb33787d9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pd6d8\" (UID: \"d1f1f29b-6485-4944-a0a9-b2afb33787d9\") " pod="openshift-insights/insights-runtime-extractor-pd6d8"
Apr 24 21:30:11.478138 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.478122 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rkf6r\" (UniqueName: \"kubernetes.io/projected/5701db21-73a5-4846-bd66-5cb8f4331749-kube-api-access-rkf6r\") pod \"downloads-6bcc868b7-kvj2r\" (UID: \"5701db21-73a5-4846-bd66-5cb8f4331749\") " pod="openshift-console/downloads-6bcc868b7-kvj2r"
Apr 24 21:30:11.478201 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.478160 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c93a2e52-5f47-430a-80f6-0b8d1ee5ab9a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-wkgcw\" (UID: \"c93a2e52-5f47-430a-80f6-0b8d1ee5ab9a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wkgcw"
Apr 24 21:30:11.478308 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.478287 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d1f1f29b-6485-4944-a0a9-b2afb33787d9-crio-socket\") pod \"insights-runtime-extractor-pd6d8\" (UID: \"d1f1f29b-6485-4944-a0a9-b2afb33787d9\") " pod="openshift-insights/insights-runtime-extractor-pd6d8"
Apr 24 21:30:11.478383 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.478366 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmslv\" (UniqueName: \"kubernetes.io/projected/d1f1f29b-6485-4944-a0a9-b2afb33787d9-kube-api-access-hmslv\") pod \"insights-runtime-extractor-pd6d8\" (UID: \"d1f1f29b-6485-4944-a0a9-b2afb33787d9\") " pod="openshift-insights/insights-runtime-extractor-pd6d8"
Apr 24 21:30:11.478451 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.478410 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d1f1f29b-6485-4944-a0a9-b2afb33787d9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pd6d8\" (UID: \"d1f1f29b-6485-4944-a0a9-b2afb33787d9\") " pod="openshift-insights/insights-runtime-extractor-pd6d8"
Apr 24 21:30:11.480699 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.480673 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c93a2e52-5f47-430a-80f6-0b8d1ee5ab9a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-wkgcw\" (UID: \"c93a2e52-5f47-430a-80f6-0b8d1ee5ab9a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wkgcw"
Apr 24 21:30:11.490412 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.490390 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkf6r\" (UniqueName: \"kubernetes.io/projected/5701db21-73a5-4846-bd66-5cb8f4331749-kube-api-access-rkf6r\") pod \"downloads-6bcc868b7-kvj2r\" (UID: \"5701db21-73a5-4846-bd66-5cb8f4331749\") " pod="openshift-console/downloads-6bcc868b7-kvj2r"
Apr 24 21:30:11.579028 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.578947 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d1f1f29b-6485-4944-a0a9-b2afb33787d9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pd6d8\" (UID: \"d1f1f29b-6485-4944-a0a9-b2afb33787d9\") " pod="openshift-insights/insights-runtime-extractor-pd6d8"
Apr 24 21:30:11.579028 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.579012 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d1f1f29b-6485-4944-a0a9-b2afb33787d9-crio-socket\") pod \"insights-runtime-extractor-pd6d8\" (UID: \"d1f1f29b-6485-4944-a0a9-b2afb33787d9\") " pod="openshift-insights/insights-runtime-extractor-pd6d8"
Apr 24 21:30:11.579243 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.579037 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hmslv\" (UniqueName: \"kubernetes.io/projected/d1f1f29b-6485-4944-a0a9-b2afb33787d9-kube-api-access-hmslv\") pod \"insights-runtime-extractor-pd6d8\" (UID: \"d1f1f29b-6485-4944-a0a9-b2afb33787d9\") " pod="openshift-insights/insights-runtime-extractor-pd6d8"
Apr 24 21:30:11.579243 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.579056 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d1f1f29b-6485-4944-a0a9-b2afb33787d9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pd6d8\" (UID: \"d1f1f29b-6485-4944-a0a9-b2afb33787d9\") " pod="openshift-insights/insights-runtime-extractor-pd6d8"
Apr 24 21:30:11.579243 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.579084 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d1f1f29b-6485-4944-a0a9-b2afb33787d9-data-volume\") pod \"insights-runtime-extractor-pd6d8\" (UID: \"d1f1f29b-6485-4944-a0a9-b2afb33787d9\") " pod="openshift-insights/insights-runtime-extractor-pd6d8"
Apr 24 21:30:11.579243 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.579147 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d1f1f29b-6485-4944-a0a9-b2afb33787d9-crio-socket\") pod \"insights-runtime-extractor-pd6d8\" (UID: \"d1f1f29b-6485-4944-a0a9-b2afb33787d9\") " pod="openshift-insights/insights-runtime-extractor-pd6d8"
Apr 24 21:30:11.579471 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.579438 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d1f1f29b-6485-4944-a0a9-b2afb33787d9-data-volume\") pod \"insights-runtime-extractor-pd6d8\" (UID: \"d1f1f29b-6485-4944-a0a9-b2afb33787d9\") " pod="openshift-insights/insights-runtime-extractor-pd6d8"
Apr 24 21:30:11.579557 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.579537 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d1f1f29b-6485-4944-a0a9-b2afb33787d9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pd6d8\" (UID: \"d1f1f29b-6485-4944-a0a9-b2afb33787d9\") " pod="openshift-insights/insights-runtime-extractor-pd6d8"
Apr 24 21:30:11.581368 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.581349 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d1f1f29b-6485-4944-a0a9-b2afb33787d9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pd6d8\" (UID: \"d1f1f29b-6485-4944-a0a9-b2afb33787d9\") " pod="openshift-insights/insights-runtime-extractor-pd6d8"
Apr 24 21:30:11.588995 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.588974 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmslv\" (UniqueName: \"kubernetes.io/projected/d1f1f29b-6485-4944-a0a9-b2afb33787d9-kube-api-access-hmslv\") pod \"insights-runtime-extractor-pd6d8\" (UID: \"d1f1f29b-6485-4944-a0a9-b2afb33787d9\") " pod="openshift-insights/insights-runtime-extractor-pd6d8"
Apr 24 21:30:11.603239 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.603219 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wkgcw"
Apr 24 21:30:11.608061 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.608039 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-kvj2r"
Apr 24 21:30:11.709480 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.709451 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-pd6d8"
Apr 24 21:30:11.742589 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.742533 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wkgcw"]
Apr 24 21:30:11.747125 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:30:11.747089 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc93a2e52_5f47_430a_80f6_0b8d1ee5ab9a.slice/crio-b82c3a9b671f3d945dce2c1f9e86f3a9bf5749a54ce895c90a455baa10d80357 WatchSource:0}: Error finding container b82c3a9b671f3d945dce2c1f9e86f3a9bf5749a54ce895c90a455baa10d80357: Status 404 returned error can't find the container with id b82c3a9b671f3d945dce2c1f9e86f3a9bf5749a54ce895c90a455baa10d80357
Apr 24 21:30:11.755439 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.755410 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-kvj2r"]
Apr 24 21:30:11.758792 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:30:11.758763 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5701db21_73a5_4846_bd66_5cb8f4331749.slice/crio-f0163111e453eaf38bb09e03c8ade6139b699f4c3aa262342733e03d0dba4563 WatchSource:0}: Error finding container f0163111e453eaf38bb09e03c8ade6139b699f4c3aa262342733e03d0dba4563: Status 404 returned error can't find the container with id f0163111e453eaf38bb09e03c8ade6139b699f4c3aa262342733e03d0dba4563
Apr 24 21:30:11.836745 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:11.836715 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-pd6d8"]
Apr 24 21:30:11.839838 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:30:11.839810 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1f1f29b_6485_4944_a0a9_b2afb33787d9.slice/crio-d773f39b9ac3800f492f8d052677e21c3e2df369bfdc5f67fa43e4797249dbf2 WatchSource:0}: Error finding container d773f39b9ac3800f492f8d052677e21c3e2df369bfdc5f67fa43e4797249dbf2: Status 404 returned error can't find the container with id d773f39b9ac3800f492f8d052677e21c3e2df369bfdc5f67fa43e4797249dbf2
Apr 24 21:30:12.441589 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:12.441548 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pd6d8" event={"ID":"d1f1f29b-6485-4944-a0a9-b2afb33787d9","Type":"ContainerStarted","Data":"3558692dc58bc10bc3ee9882c82be5797809af9426659edb6fe160c30710c7be"}
Apr 24 21:30:12.442027 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:12.441594 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pd6d8" event={"ID":"d1f1f29b-6485-4944-a0a9-b2afb33787d9","Type":"ContainerStarted","Data":"d773f39b9ac3800f492f8d052677e21c3e2df369bfdc5f67fa43e4797249dbf2"}
Apr 24 21:30:12.442817 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:12.442787 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-kvj2r" event={"ID":"5701db21-73a5-4846-bd66-5cb8f4331749","Type":"ContainerStarted","Data":"f0163111e453eaf38bb09e03c8ade6139b699f4c3aa262342733e03d0dba4563"}
Apr 24 21:30:12.444512 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:12.444483 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wkgcw" event={"ID":"c93a2e52-5f47-430a-80f6-0b8d1ee5ab9a","Type":"ContainerStarted","Data":"b82c3a9b671f3d945dce2c1f9e86f3a9bf5749a54ce895c90a455baa10d80357"}
Apr 24 21:30:13.449594 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:13.449552 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pd6d8" event={"ID":"d1f1f29b-6485-4944-a0a9-b2afb33787d9","Type":"ContainerStarted","Data":"884fdcf9284c9dd3617ef424e981332938b327fff78057574e5cdae337defe1d"}
Apr 24 21:30:13.451665 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:13.451618 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wkgcw" event={"ID":"c93a2e52-5f47-430a-80f6-0b8d1ee5ab9a","Type":"ContainerStarted","Data":"e7e96d1e13dafc1b258f177eb83f7401a9b9883a0d081f0674ebc3a87f4af2e9"}
Apr 24 21:30:13.452100 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:13.452071 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wkgcw"
Apr 24 21:30:13.458422 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:13.458388 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wkgcw"
Apr 24 21:30:13.469216 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:13.469162 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wkgcw" podStartSLOduration=1.366835006 podStartE2EDuration="2.469144785s" podCreationTimestamp="2026-04-24 21:30:11 +0000 UTC" firstStartedPulling="2026-04-24 21:30:11.749433448 +0000 UTC m=+165.363703718" lastFinishedPulling="2026-04-24 21:30:12.851743225 +0000 UTC m=+166.466013497" observedRunningTime="2026-04-24 21:30:13.467498176 +0000 UTC m=+167.081768455" watchObservedRunningTime="2026-04-24 21:30:13.469144785 +0000 UTC m=+167.083415065"
Apr 24 21:30:14.052648 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.052610 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-76bc57ff8b-fs9w2"]
Apr 24 21:30:14.057077 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.057052 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76bc57ff8b-fs9w2"
Apr 24 21:30:14.060638 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.060293 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 24 21:30:14.060638 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.060403 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-kvstk\""
Apr 24 21:30:14.060638 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.060440 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 24 21:30:14.060638 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.060515 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 24 21:30:14.060964 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.060695 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 24 21:30:14.060964 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.060818 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 24 21:30:14.064588 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.064562 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76bc57ff8b-fs9w2"]
Apr 24 21:30:14.101357 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.101320 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/30e7a79a-0748-4988-8fd9-9a116182fc1e-console-config\") pod \"console-76bc57ff8b-fs9w2\" (UID: \"30e7a79a-0748-4988-8fd9-9a116182fc1e\") " pod="openshift-console/console-76bc57ff8b-fs9w2"
Apr 24 21:30:14.101543 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.101376 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z77x\" (UniqueName: \"kubernetes.io/projected/30e7a79a-0748-4988-8fd9-9a116182fc1e-kube-api-access-6z77x\") pod \"console-76bc57ff8b-fs9w2\" (UID: \"30e7a79a-0748-4988-8fd9-9a116182fc1e\") " pod="openshift-console/console-76bc57ff8b-fs9w2"
Apr 24 21:30:14.101543 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.101409 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/30e7a79a-0748-4988-8fd9-9a116182fc1e-console-serving-cert\") pod \"console-76bc57ff8b-fs9w2\" (UID: \"30e7a79a-0748-4988-8fd9-9a116182fc1e\") " pod="openshift-console/console-76bc57ff8b-fs9w2"
Apr 24 21:30:14.101543 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.101492 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/30e7a79a-0748-4988-8fd9-9a116182fc1e-service-ca\") pod \"console-76bc57ff8b-fs9w2\" (UID: \"30e7a79a-0748-4988-8fd9-9a116182fc1e\") " pod="openshift-console/console-76bc57ff8b-fs9w2"
Apr 24 21:30:14.101543 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.101533 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/30e7a79a-0748-4988-8fd9-9a116182fc1e-oauth-serving-cert\") pod \"console-76bc57ff8b-fs9w2\" (UID: \"30e7a79a-0748-4988-8fd9-9a116182fc1e\") " pod="openshift-console/console-76bc57ff8b-fs9w2"
Apr 24 21:30:14.101805 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.101589 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName:
\"kubernetes.io/secret/30e7a79a-0748-4988-8fd9-9a116182fc1e-console-oauth-config\") pod \"console-76bc57ff8b-fs9w2\" (UID: \"30e7a79a-0748-4988-8fd9-9a116182fc1e\") " pod="openshift-console/console-76bc57ff8b-fs9w2" Apr 24 21:30:14.202645 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.202608 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/30e7a79a-0748-4988-8fd9-9a116182fc1e-console-config\") pod \"console-76bc57ff8b-fs9w2\" (UID: \"30e7a79a-0748-4988-8fd9-9a116182fc1e\") " pod="openshift-console/console-76bc57ff8b-fs9w2" Apr 24 21:30:14.202851 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.202670 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6z77x\" (UniqueName: \"kubernetes.io/projected/30e7a79a-0748-4988-8fd9-9a116182fc1e-kube-api-access-6z77x\") pod \"console-76bc57ff8b-fs9w2\" (UID: \"30e7a79a-0748-4988-8fd9-9a116182fc1e\") " pod="openshift-console/console-76bc57ff8b-fs9w2" Apr 24 21:30:14.202851 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.202708 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/30e7a79a-0748-4988-8fd9-9a116182fc1e-console-serving-cert\") pod \"console-76bc57ff8b-fs9w2\" (UID: \"30e7a79a-0748-4988-8fd9-9a116182fc1e\") " pod="openshift-console/console-76bc57ff8b-fs9w2" Apr 24 21:30:14.202851 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.202762 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/30e7a79a-0748-4988-8fd9-9a116182fc1e-service-ca\") pod \"console-76bc57ff8b-fs9w2\" (UID: \"30e7a79a-0748-4988-8fd9-9a116182fc1e\") " pod="openshift-console/console-76bc57ff8b-fs9w2" Apr 24 21:30:14.202851 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.202786 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/30e7a79a-0748-4988-8fd9-9a116182fc1e-oauth-serving-cert\") pod \"console-76bc57ff8b-fs9w2\" (UID: \"30e7a79a-0748-4988-8fd9-9a116182fc1e\") " pod="openshift-console/console-76bc57ff8b-fs9w2" Apr 24 21:30:14.202851 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.202825 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/30e7a79a-0748-4988-8fd9-9a116182fc1e-console-oauth-config\") pod \"console-76bc57ff8b-fs9w2\" (UID: \"30e7a79a-0748-4988-8fd9-9a116182fc1e\") " pod="openshift-console/console-76bc57ff8b-fs9w2" Apr 24 21:30:14.203917 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.203852 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/30e7a79a-0748-4988-8fd9-9a116182fc1e-console-config\") pod \"console-76bc57ff8b-fs9w2\" (UID: \"30e7a79a-0748-4988-8fd9-9a116182fc1e\") " pod="openshift-console/console-76bc57ff8b-fs9w2" Apr 24 21:30:14.203917 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.203873 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/30e7a79a-0748-4988-8fd9-9a116182fc1e-service-ca\") pod \"console-76bc57ff8b-fs9w2\" (UID: \"30e7a79a-0748-4988-8fd9-9a116182fc1e\") " pod="openshift-console/console-76bc57ff8b-fs9w2" Apr 24 21:30:14.204086 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.203944 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/30e7a79a-0748-4988-8fd9-9a116182fc1e-oauth-serving-cert\") pod \"console-76bc57ff8b-fs9w2\" (UID: \"30e7a79a-0748-4988-8fd9-9a116182fc1e\") " pod="openshift-console/console-76bc57ff8b-fs9w2" Apr 24 21:30:14.206265 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.206229 
2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/30e7a79a-0748-4988-8fd9-9a116182fc1e-console-serving-cert\") pod \"console-76bc57ff8b-fs9w2\" (UID: \"30e7a79a-0748-4988-8fd9-9a116182fc1e\") " pod="openshift-console/console-76bc57ff8b-fs9w2" Apr 24 21:30:14.206652 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.206634 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/30e7a79a-0748-4988-8fd9-9a116182fc1e-console-oauth-config\") pod \"console-76bc57ff8b-fs9w2\" (UID: \"30e7a79a-0748-4988-8fd9-9a116182fc1e\") " pod="openshift-console/console-76bc57ff8b-fs9w2" Apr 24 21:30:14.211984 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.211962 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z77x\" (UniqueName: \"kubernetes.io/projected/30e7a79a-0748-4988-8fd9-9a116182fc1e-kube-api-access-6z77x\") pod \"console-76bc57ff8b-fs9w2\" (UID: \"30e7a79a-0748-4988-8fd9-9a116182fc1e\") " pod="openshift-console/console-76bc57ff8b-fs9w2" Apr 24 21:30:14.295169 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.295125 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-5hp6m"] Apr 24 21:30:14.300283 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.300233 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-5hp6m" Apr 24 21:30:14.302784 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.302678 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 24 21:30:14.302784 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.302683 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 24 21:30:14.302784 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.302761 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 21:30:14.302784 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.302774 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-ghf7j\"" Apr 24 21:30:14.306635 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.306611 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-5hp6m"] Apr 24 21:30:14.370154 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.370111 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76bc57ff8b-fs9w2" Apr 24 21:30:14.404094 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.403885 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/2260b4d6-e6d6-4354-8ff5-52f186f6fdba-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-5hp6m\" (UID: \"2260b4d6-e6d6-4354-8ff5-52f186f6fdba\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5hp6m" Apr 24 21:30:14.404094 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.403946 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2260b4d6-e6d6-4354-8ff5-52f186f6fdba-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-5hp6m\" (UID: \"2260b4d6-e6d6-4354-8ff5-52f186f6fdba\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5hp6m" Apr 24 21:30:14.404094 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.403991 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2260b4d6-e6d6-4354-8ff5-52f186f6fdba-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-5hp6m\" (UID: \"2260b4d6-e6d6-4354-8ff5-52f186f6fdba\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5hp6m" Apr 24 21:30:14.404094 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.404032 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fd4q\" (UniqueName: \"kubernetes.io/projected/2260b4d6-e6d6-4354-8ff5-52f186f6fdba-kube-api-access-4fd4q\") pod \"prometheus-operator-5676c8c784-5hp6m\" (UID: \"2260b4d6-e6d6-4354-8ff5-52f186f6fdba\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5hp6m" Apr 24 21:30:14.504492 
ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.504455 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2260b4d6-e6d6-4354-8ff5-52f186f6fdba-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-5hp6m\" (UID: \"2260b4d6-e6d6-4354-8ff5-52f186f6fdba\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5hp6m" Apr 24 21:30:14.504925 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.504527 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4fd4q\" (UniqueName: \"kubernetes.io/projected/2260b4d6-e6d6-4354-8ff5-52f186f6fdba-kube-api-access-4fd4q\") pod \"prometheus-operator-5676c8c784-5hp6m\" (UID: \"2260b4d6-e6d6-4354-8ff5-52f186f6fdba\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5hp6m" Apr 24 21:30:14.504925 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.504621 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/2260b4d6-e6d6-4354-8ff5-52f186f6fdba-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-5hp6m\" (UID: \"2260b4d6-e6d6-4354-8ff5-52f186f6fdba\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5hp6m" Apr 24 21:30:14.504925 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.504669 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2260b4d6-e6d6-4354-8ff5-52f186f6fdba-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-5hp6m\" (UID: \"2260b4d6-e6d6-4354-8ff5-52f186f6fdba\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5hp6m" Apr 24 21:30:14.506078 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.506032 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/2260b4d6-e6d6-4354-8ff5-52f186f6fdba-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-5hp6m\" (UID: \"2260b4d6-e6d6-4354-8ff5-52f186f6fdba\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5hp6m" Apr 24 21:30:14.507721 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.507695 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/2260b4d6-e6d6-4354-8ff5-52f186f6fdba-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-5hp6m\" (UID: \"2260b4d6-e6d6-4354-8ff5-52f186f6fdba\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5hp6m" Apr 24 21:30:14.508078 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.508050 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2260b4d6-e6d6-4354-8ff5-52f186f6fdba-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-5hp6m\" (UID: \"2260b4d6-e6d6-4354-8ff5-52f186f6fdba\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5hp6m" Apr 24 21:30:14.513060 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.513037 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fd4q\" (UniqueName: \"kubernetes.io/projected/2260b4d6-e6d6-4354-8ff5-52f186f6fdba-kube-api-access-4fd4q\") pod \"prometheus-operator-5676c8c784-5hp6m\" (UID: \"2260b4d6-e6d6-4354-8ff5-52f186f6fdba\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-5hp6m" Apr 24 21:30:14.536213 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.536178 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76bc57ff8b-fs9w2"] Apr 24 21:30:14.539438 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:30:14.539400 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30e7a79a_0748_4988_8fd9_9a116182fc1e.slice/crio-e3b9dfbf3e9000f6fefc506a8c344994781cf13c9b66f7a0a750f2fd73a73c0c WatchSource:0}: Error finding container e3b9dfbf3e9000f6fefc506a8c344994781cf13c9b66f7a0a750f2fd73a73c0c: Status 404 returned error can't find the container with id e3b9dfbf3e9000f6fefc506a8c344994781cf13c9b66f7a0a750f2fd73a73c0c Apr 24 21:30:14.612473 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.612438 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-5hp6m" Apr 24 21:30:14.742620 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:14.742586 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-5hp6m"] Apr 24 21:30:14.745859 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:30:14.745832 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2260b4d6_e6d6_4354_8ff5_52f186f6fdba.slice/crio-22291f8a59bc747edcce1da21dfc784eed2fe9e9fe0229e8ad4b05b5cff5098f WatchSource:0}: Error finding container 22291f8a59bc747edcce1da21dfc784eed2fe9e9fe0229e8ad4b05b5cff5098f: Status 404 returned error can't find the container with id 22291f8a59bc747edcce1da21dfc784eed2fe9e9fe0229e8ad4b05b5cff5098f Apr 24 21:30:15.464528 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:15.464457 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pd6d8" event={"ID":"d1f1f29b-6485-4944-a0a9-b2afb33787d9","Type":"ContainerStarted","Data":"04e7773fa205330688c412e1bbc9d77f9f76154d6185b4bff5b12de65cb6e88b"} Apr 24 21:30:15.466580 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:15.466536 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76bc57ff8b-fs9w2" 
event={"ID":"30e7a79a-0748-4988-8fd9-9a116182fc1e","Type":"ContainerStarted","Data":"e3b9dfbf3e9000f6fefc506a8c344994781cf13c9b66f7a0a750f2fd73a73c0c"} Apr 24 21:30:15.467772 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:15.467681 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-5hp6m" event={"ID":"2260b4d6-e6d6-4354-8ff5-52f186f6fdba","Type":"ContainerStarted","Data":"22291f8a59bc747edcce1da21dfc784eed2fe9e9fe0229e8ad4b05b5cff5098f"} Apr 24 21:30:15.485970 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:15.485763 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-pd6d8" podStartSLOduration=1.988531664 podStartE2EDuration="4.485745252s" podCreationTimestamp="2026-04-24 21:30:11 +0000 UTC" firstStartedPulling="2026-04-24 21:30:11.895087102 +0000 UTC m=+165.509357360" lastFinishedPulling="2026-04-24 21:30:14.392300691 +0000 UTC m=+168.006570948" observedRunningTime="2026-04-24 21:30:15.485007026 +0000 UTC m=+169.099277306" watchObservedRunningTime="2026-04-24 21:30:15.485745252 +0000 UTC m=+169.100015533" Apr 24 21:30:16.473544 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:16.473409 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-5hp6m" event={"ID":"2260b4d6-e6d6-4354-8ff5-52f186f6fdba","Type":"ContainerStarted","Data":"957de7d57c175aeccda4f77b805bb67935c452b1ed952d9701cdba8722001e29"} Apr 24 21:30:16.473544 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:16.473466 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-5hp6m" event={"ID":"2260b4d6-e6d6-4354-8ff5-52f186f6fdba","Type":"ContainerStarted","Data":"76e95437d6055414ef48d3b9ac40c575ea90fbc3013e4b946eebda23783bd027"} Apr 24 21:30:16.501026 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:16.500970 2577 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-5hp6m" podStartSLOduration=1.111381168 podStartE2EDuration="2.500957353s" podCreationTimestamp="2026-04-24 21:30:14 +0000 UTC" firstStartedPulling="2026-04-24 21:30:14.747954387 +0000 UTC m=+168.362224647" lastFinishedPulling="2026-04-24 21:30:16.137530572 +0000 UTC m=+169.751800832" observedRunningTime="2026-04-24 21:30:16.499166942 +0000 UTC m=+170.113437246" watchObservedRunningTime="2026-04-24 21:30:16.500957353 +0000 UTC m=+170.115227623" Apr 24 21:30:16.969604 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:16.969572 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcztz" Apr 24 21:30:17.967172 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:17.967131 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zhh6t" Apr 24 21:30:17.970235 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:17.970205 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-48pk8\"" Apr 24 21:30:17.977860 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:17.977831 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zhh6t" Apr 24 21:30:18.121462 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.121431 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zhh6t"] Apr 24 21:30:18.126085 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:30:18.126043 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddef556d7_437a_4b70_b31e_6643ed89bc7e.slice/crio-9d1a86468b52ce820893e3d506eba8de48925d844e725c6ca86a450ade295a70 WatchSource:0}: Error finding container 9d1a86468b52ce820893e3d506eba8de48925d844e725c6ca86a450ade295a70: Status 404 returned error can't find the container with id 9d1a86468b52ce820893e3d506eba8de48925d844e725c6ca86a450ade295a70 Apr 24 21:30:18.481960 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.481882 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76bc57ff8b-fs9w2" event={"ID":"30e7a79a-0748-4988-8fd9-9a116182fc1e","Type":"ContainerStarted","Data":"67c5229ef818d76cd5a7311dd1cc4c794503b65a1658cc832ab2ce6566d5bbd7"} Apr 24 21:30:18.483131 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.483098 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zhh6t" event={"ID":"def556d7-437a-4b70-b31e-6643ed89bc7e","Type":"ContainerStarted","Data":"9d1a86468b52ce820893e3d506eba8de48925d844e725c6ca86a450ade295a70"} Apr 24 21:30:18.526332 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.526237 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-76bc57ff8b-fs9w2" podStartSLOduration=1.27236081 podStartE2EDuration="4.526220203s" podCreationTimestamp="2026-04-24 21:30:14 +0000 UTC" firstStartedPulling="2026-04-24 21:30:14.541627833 +0000 UTC m=+168.155898090" lastFinishedPulling="2026-04-24 21:30:17.795487222 +0000 UTC m=+171.409757483" 
observedRunningTime="2026-04-24 21:30:18.524874688 +0000 UTC m=+172.139144971" watchObservedRunningTime="2026-04-24 21:30:18.526220203 +0000 UTC m=+172.140490481" Apr 24 21:30:18.726147 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.726111 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-w6q5b"] Apr 24 21:30:18.730313 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.729905 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:18.732433 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.732375 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 21:30:18.732433 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.732418 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 21:30:18.733040 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.733002 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-9mvpt\"" Apr 24 21:30:18.733207 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.733185 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 21:30:18.846353 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.846318 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/137ba05c-a1f1-46e3-b5fc-28b957ed5fc9-root\") pod \"node-exporter-w6q5b\" (UID: \"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9\") " pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:18.846543 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.846365 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/137ba05c-a1f1-46e3-b5fc-28b957ed5fc9-node-exporter-wtmp\") pod \"node-exporter-w6q5b\" (UID: \"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9\") " pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:18.846543 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.846527 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/137ba05c-a1f1-46e3-b5fc-28b957ed5fc9-sys\") pod \"node-exporter-w6q5b\" (UID: \"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9\") " pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:18.846652 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.846601 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/137ba05c-a1f1-46e3-b5fc-28b957ed5fc9-node-exporter-accelerators-collector-config\") pod \"node-exporter-w6q5b\" (UID: \"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9\") " pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:18.846652 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.846638 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/137ba05c-a1f1-46e3-b5fc-28b957ed5fc9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-w6q5b\" (UID: \"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9\") " pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:18.846759 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.846709 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhl67\" (UniqueName: \"kubernetes.io/projected/137ba05c-a1f1-46e3-b5fc-28b957ed5fc9-kube-api-access-fhl67\") pod \"node-exporter-w6q5b\" (UID: \"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9\") " 
pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:18.846759 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.846751 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/137ba05c-a1f1-46e3-b5fc-28b957ed5fc9-node-exporter-textfile\") pod \"node-exporter-w6q5b\" (UID: \"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9\") " pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:18.846861 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.846774 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/137ba05c-a1f1-46e3-b5fc-28b957ed5fc9-metrics-client-ca\") pod \"node-exporter-w6q5b\" (UID: \"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9\") " pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:18.846861 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.846801 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/137ba05c-a1f1-46e3-b5fc-28b957ed5fc9-node-exporter-tls\") pod \"node-exporter-w6q5b\" (UID: \"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9\") " pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:18.948019 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.947982 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/137ba05c-a1f1-46e3-b5fc-28b957ed5fc9-node-exporter-textfile\") pod \"node-exporter-w6q5b\" (UID: \"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9\") " pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:18.948019 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.948024 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/137ba05c-a1f1-46e3-b5fc-28b957ed5fc9-metrics-client-ca\") pod \"node-exporter-w6q5b\" (UID: \"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9\") " pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:18.948327 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.948049 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/137ba05c-a1f1-46e3-b5fc-28b957ed5fc9-node-exporter-tls\") pod \"node-exporter-w6q5b\" (UID: \"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9\") " pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:18.948327 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.948098 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/137ba05c-a1f1-46e3-b5fc-28b957ed5fc9-root\") pod \"node-exporter-w6q5b\" (UID: \"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9\") " pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:18.948327 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.948123 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/137ba05c-a1f1-46e3-b5fc-28b957ed5fc9-node-exporter-wtmp\") pod \"node-exporter-w6q5b\" (UID: \"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9\") " pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:18.948327 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.948153 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/137ba05c-a1f1-46e3-b5fc-28b957ed5fc9-sys\") pod \"node-exporter-w6q5b\" (UID: \"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9\") " pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:18.948327 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.948213 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" 
(UniqueName: \"kubernetes.io/configmap/137ba05c-a1f1-46e3-b5fc-28b957ed5fc9-node-exporter-accelerators-collector-config\") pod \"node-exporter-w6q5b\" (UID: \"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9\") " pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:18.948327 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.948241 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/137ba05c-a1f1-46e3-b5fc-28b957ed5fc9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-w6q5b\" (UID: \"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9\") " pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:18.948631 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.948334 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhl67\" (UniqueName: \"kubernetes.io/projected/137ba05c-a1f1-46e3-b5fc-28b957ed5fc9-kube-api-access-fhl67\") pod \"node-exporter-w6q5b\" (UID: \"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9\") " pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:18.948631 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.948451 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/137ba05c-a1f1-46e3-b5fc-28b957ed5fc9-node-exporter-textfile\") pod \"node-exporter-w6q5b\" (UID: \"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9\") " pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:18.948631 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.948500 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/137ba05c-a1f1-46e3-b5fc-28b957ed5fc9-node-exporter-wtmp\") pod \"node-exporter-w6q5b\" (UID: \"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9\") " pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:18.948631 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.948547 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/137ba05c-a1f1-46e3-b5fc-28b957ed5fc9-root\") pod \"node-exporter-w6q5b\" (UID: \"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9\") " pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:18.948631 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.948593 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/137ba05c-a1f1-46e3-b5fc-28b957ed5fc9-sys\") pod \"node-exporter-w6q5b\" (UID: \"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9\") " pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:18.949196 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.949159 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/137ba05c-a1f1-46e3-b5fc-28b957ed5fc9-node-exporter-accelerators-collector-config\") pod \"node-exporter-w6q5b\" (UID: \"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9\") " pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:18.949342 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.949194 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/137ba05c-a1f1-46e3-b5fc-28b957ed5fc9-metrics-client-ca\") pod \"node-exporter-w6q5b\" (UID: \"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9\") " pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:18.951314 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.951280 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/137ba05c-a1f1-46e3-b5fc-28b957ed5fc9-node-exporter-tls\") pod \"node-exporter-w6q5b\" (UID: \"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9\") " pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:18.951542 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.951520 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/137ba05c-a1f1-46e3-b5fc-28b957ed5fc9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-w6q5b\" (UID: \"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9\") " pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:18.956520 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:18.956493 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhl67\" (UniqueName: \"kubernetes.io/projected/137ba05c-a1f1-46e3-b5fc-28b957ed5fc9-kube-api-access-fhl67\") pod \"node-exporter-w6q5b\" (UID: \"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9\") " pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:19.042960 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:19.042868 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-w6q5b" Apr 24 21:30:19.054002 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:30:19.053963 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod137ba05c_a1f1_46e3_b5fc_28b957ed5fc9.slice/crio-560c0b71b8977e4c366e3da1998ddae8192923ffc486eb8dae623193f9c6fe3a WatchSource:0}: Error finding container 560c0b71b8977e4c366e3da1998ddae8192923ffc486eb8dae623193f9c6fe3a: Status 404 returned error can't find the container with id 560c0b71b8977e4c366e3da1998ddae8192923ffc486eb8dae623193f9c6fe3a Apr 24 21:30:19.487500 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:19.487461 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-w6q5b" event={"ID":"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9","Type":"ContainerStarted","Data":"560c0b71b8977e4c366e3da1998ddae8192923ffc486eb8dae623193f9c6fe3a"} Apr 24 21:30:20.494951 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:20.494908 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-w6q5b" event={"ID":"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9","Type":"ContainerStarted","Data":"fd943cbccd3e43f086adecd93c3fa686b588c71c540fe84a2571343297d13827"} Apr 24 21:30:20.497950 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:20.497912 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zhh6t" event={"ID":"def556d7-437a-4b70-b31e-6643ed89bc7e","Type":"ContainerStarted","Data":"f91304f8c97a8479f486cd67f4447a876db62e4fa7db53ff3272a598c2402e11"} Apr 24 21:30:21.447548 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:21.447517 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-ttnhg" Apr 24 21:30:21.485810 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:21.485742 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zhh6t" podStartSLOduration=139.771083003 podStartE2EDuration="2m21.485722395s" podCreationTimestamp="2026-04-24 21:28:00 +0000 UTC" firstStartedPulling="2026-04-24 21:30:18.128400897 +0000 UTC m=+171.742671171" lastFinishedPulling="2026-04-24 21:30:19.843040297 +0000 UTC m=+173.457310563" observedRunningTime="2026-04-24 21:30:20.536622485 +0000 UTC m=+174.150892767" watchObservedRunningTime="2026-04-24 21:30:21.485722395 +0000 UTC m=+175.099992675" Apr 24 21:30:21.503606 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:21.503574 2577 generic.go:358] "Generic (PLEG): container finished" podID="137ba05c-a1f1-46e3-b5fc-28b957ed5fc9" containerID="fd943cbccd3e43f086adecd93c3fa686b588c71c540fe84a2571343297d13827" exitCode=0 Apr 24 21:30:21.504022 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:21.503921 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-w6q5b" event={"ID":"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9","Type":"ContainerDied","Data":"fd943cbccd3e43f086adecd93c3fa686b588c71c540fe84a2571343297d13827"} Apr 24 
21:30:22.731655 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.731616 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-754db4fb4f-dmkdh"] Apr 24 21:30:22.737141 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.736993 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" Apr 24 21:30:22.739587 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.739552 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-re895los8jbd\"" Apr 24 21:30:22.740097 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.739590 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 24 21:30:22.740097 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.739621 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 24 21:30:22.740097 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.739693 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-ngfcb\"" Apr 24 21:30:22.740097 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.739552 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 24 21:30:22.740488 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.740459 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 24 21:30:22.740587 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.740492 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 24 21:30:22.747295 ip-10-0-129-36 
kubenswrapper[2577]: I0424 21:30:22.747274 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-754db4fb4f-dmkdh"] Apr 24 21:30:22.784995 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.784961 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/bf620266-e2bc-4c7f-9f17-6e6bd7623500-secret-grpc-tls\") pod \"thanos-querier-754db4fb4f-dmkdh\" (UID: \"bf620266-e2bc-4c7f-9f17-6e6bd7623500\") " pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" Apr 24 21:30:22.784995 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.784997 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bf620266-e2bc-4c7f-9f17-6e6bd7623500-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-754db4fb4f-dmkdh\" (UID: \"bf620266-e2bc-4c7f-9f17-6e6bd7623500\") " pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" Apr 24 21:30:22.785234 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.785033 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bf620266-e2bc-4c7f-9f17-6e6bd7623500-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-754db4fb4f-dmkdh\" (UID: \"bf620266-e2bc-4c7f-9f17-6e6bd7623500\") " pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" Apr 24 21:30:22.785234 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.785132 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/bf620266-e2bc-4c7f-9f17-6e6bd7623500-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-754db4fb4f-dmkdh\" (UID: 
\"bf620266-e2bc-4c7f-9f17-6e6bd7623500\") " pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" Apr 24 21:30:22.785234 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.785210 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/bf620266-e2bc-4c7f-9f17-6e6bd7623500-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-754db4fb4f-dmkdh\" (UID: \"bf620266-e2bc-4c7f-9f17-6e6bd7623500\") " pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" Apr 24 21:30:22.785388 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.785331 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bf620266-e2bc-4c7f-9f17-6e6bd7623500-metrics-client-ca\") pod \"thanos-querier-754db4fb4f-dmkdh\" (UID: \"bf620266-e2bc-4c7f-9f17-6e6bd7623500\") " pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" Apr 24 21:30:22.785388 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.785367 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/bf620266-e2bc-4c7f-9f17-6e6bd7623500-secret-thanos-querier-tls\") pod \"thanos-querier-754db4fb4f-dmkdh\" (UID: \"bf620266-e2bc-4c7f-9f17-6e6bd7623500\") " pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" Apr 24 21:30:22.785490 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.785390 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv4f6\" (UniqueName: \"kubernetes.io/projected/bf620266-e2bc-4c7f-9f17-6e6bd7623500-kube-api-access-cv4f6\") pod \"thanos-querier-754db4fb4f-dmkdh\" (UID: \"bf620266-e2bc-4c7f-9f17-6e6bd7623500\") " pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" Apr 24 21:30:22.886265 
ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.886210 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bf620266-e2bc-4c7f-9f17-6e6bd7623500-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-754db4fb4f-dmkdh\" (UID: \"bf620266-e2bc-4c7f-9f17-6e6bd7623500\") " pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" Apr 24 21:30:22.886464 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.886306 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bf620266-e2bc-4c7f-9f17-6e6bd7623500-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-754db4fb4f-dmkdh\" (UID: \"bf620266-e2bc-4c7f-9f17-6e6bd7623500\") " pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" Apr 24 21:30:22.886464 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.886363 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/bf620266-e2bc-4c7f-9f17-6e6bd7623500-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-754db4fb4f-dmkdh\" (UID: \"bf620266-e2bc-4c7f-9f17-6e6bd7623500\") " pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" Apr 24 21:30:22.886464 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.886422 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/bf620266-e2bc-4c7f-9f17-6e6bd7623500-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-754db4fb4f-dmkdh\" (UID: \"bf620266-e2bc-4c7f-9f17-6e6bd7623500\") " pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" Apr 24 21:30:22.886631 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.886482 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bf620266-e2bc-4c7f-9f17-6e6bd7623500-metrics-client-ca\") pod \"thanos-querier-754db4fb4f-dmkdh\" (UID: \"bf620266-e2bc-4c7f-9f17-6e6bd7623500\") " pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" Apr 24 21:30:22.886631 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.886519 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/bf620266-e2bc-4c7f-9f17-6e6bd7623500-secret-thanos-querier-tls\") pod \"thanos-querier-754db4fb4f-dmkdh\" (UID: \"bf620266-e2bc-4c7f-9f17-6e6bd7623500\") " pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" Apr 24 21:30:22.886631 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.886551 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cv4f6\" (UniqueName: \"kubernetes.io/projected/bf620266-e2bc-4c7f-9f17-6e6bd7623500-kube-api-access-cv4f6\") pod \"thanos-querier-754db4fb4f-dmkdh\" (UID: \"bf620266-e2bc-4c7f-9f17-6e6bd7623500\") " pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" Apr 24 21:30:22.886631 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.886607 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/bf620266-e2bc-4c7f-9f17-6e6bd7623500-secret-grpc-tls\") pod \"thanos-querier-754db4fb4f-dmkdh\" (UID: \"bf620266-e2bc-4c7f-9f17-6e6bd7623500\") " pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" Apr 24 21:30:22.887351 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.887311 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bf620266-e2bc-4c7f-9f17-6e6bd7623500-metrics-client-ca\") pod \"thanos-querier-754db4fb4f-dmkdh\" (UID: \"bf620266-e2bc-4c7f-9f17-6e6bd7623500\") " 
pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" Apr 24 21:30:22.889682 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.889561 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bf620266-e2bc-4c7f-9f17-6e6bd7623500-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-754db4fb4f-dmkdh\" (UID: \"bf620266-e2bc-4c7f-9f17-6e6bd7623500\") " pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" Apr 24 21:30:22.889682 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.889630 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/bf620266-e2bc-4c7f-9f17-6e6bd7623500-secret-grpc-tls\") pod \"thanos-querier-754db4fb4f-dmkdh\" (UID: \"bf620266-e2bc-4c7f-9f17-6e6bd7623500\") " pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" Apr 24 21:30:22.890016 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.889971 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/bf620266-e2bc-4c7f-9f17-6e6bd7623500-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-754db4fb4f-dmkdh\" (UID: \"bf620266-e2bc-4c7f-9f17-6e6bd7623500\") " pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" Apr 24 21:30:22.890126 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.890071 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bf620266-e2bc-4c7f-9f17-6e6bd7623500-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-754db4fb4f-dmkdh\" (UID: \"bf620266-e2bc-4c7f-9f17-6e6bd7623500\") " pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" Apr 24 21:30:22.890477 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.890453 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/bf620266-e2bc-4c7f-9f17-6e6bd7623500-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-754db4fb4f-dmkdh\" (UID: \"bf620266-e2bc-4c7f-9f17-6e6bd7623500\") " pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" Apr 24 21:30:22.891008 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.890989 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/bf620266-e2bc-4c7f-9f17-6e6bd7623500-secret-thanos-querier-tls\") pod \"thanos-querier-754db4fb4f-dmkdh\" (UID: \"bf620266-e2bc-4c7f-9f17-6e6bd7623500\") " pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" Apr 24 21:30:22.895135 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:22.895115 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv4f6\" (UniqueName: \"kubernetes.io/projected/bf620266-e2bc-4c7f-9f17-6e6bd7623500-kube-api-access-cv4f6\") pod \"thanos-querier-754db4fb4f-dmkdh\" (UID: \"bf620266-e2bc-4c7f-9f17-6e6bd7623500\") " pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" Apr 24 21:30:23.014038 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.013274 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-788b6459f7-2nk2j"] Apr 24 21:30:23.018723 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.018696 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j" Apr 24 21:30:23.021074 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.021049 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 24 21:30:23.021365 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.021329 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 24 21:30:23.021669 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.021590 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-7f3hdjkie6ulo\"" Apr 24 21:30:23.021669 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.021613 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 21:30:23.021669 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.021597 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 24 21:30:23.021669 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.021657 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-qhs8n\"" Apr 24 21:30:23.030767 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.030742 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-788b6459f7-2nk2j"] Apr 24 21:30:23.050116 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.050087 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" Apr 24 21:30:23.088737 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.088694 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f441a5f-864e-4788-b82c-734df5988bbd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-788b6459f7-2nk2j\" (UID: \"7f441a5f-864e-4788-b82c-734df5988bbd\") " pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j" Apr 24 21:30:23.088918 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.088751 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/7f441a5f-864e-4788-b82c-734df5988bbd-metrics-server-audit-profiles\") pod \"metrics-server-788b6459f7-2nk2j\" (UID: \"7f441a5f-864e-4788-b82c-734df5988bbd\") " pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j" Apr 24 21:30:23.088918 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.088806 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/7f441a5f-864e-4788-b82c-734df5988bbd-secret-metrics-server-client-certs\") pod \"metrics-server-788b6459f7-2nk2j\" (UID: \"7f441a5f-864e-4788-b82c-734df5988bbd\") " pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j" Apr 24 21:30:23.088918 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.088899 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/7f441a5f-864e-4788-b82c-734df5988bbd-audit-log\") pod \"metrics-server-788b6459f7-2nk2j\" (UID: \"7f441a5f-864e-4788-b82c-734df5988bbd\") " pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j" Apr 24 21:30:23.089071 ip-10-0-129-36 
kubenswrapper[2577]: I0424 21:30:23.088928 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlcvf\" (UniqueName: \"kubernetes.io/projected/7f441a5f-864e-4788-b82c-734df5988bbd-kube-api-access-dlcvf\") pod \"metrics-server-788b6459f7-2nk2j\" (UID: \"7f441a5f-864e-4788-b82c-734df5988bbd\") " pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j" Apr 24 21:30:23.089071 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.088964 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/7f441a5f-864e-4788-b82c-734df5988bbd-secret-metrics-server-tls\") pod \"metrics-server-788b6459f7-2nk2j\" (UID: \"7f441a5f-864e-4788-b82c-734df5988bbd\") " pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j" Apr 24 21:30:23.089071 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.088992 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f441a5f-864e-4788-b82c-734df5988bbd-client-ca-bundle\") pod \"metrics-server-788b6459f7-2nk2j\" (UID: \"7f441a5f-864e-4788-b82c-734df5988bbd\") " pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j" Apr 24 21:30:23.190257 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.190209 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f441a5f-864e-4788-b82c-734df5988bbd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-788b6459f7-2nk2j\" (UID: \"7f441a5f-864e-4788-b82c-734df5988bbd\") " pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j" Apr 24 21:30:23.190451 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.190270 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: 
\"kubernetes.io/configmap/7f441a5f-864e-4788-b82c-734df5988bbd-metrics-server-audit-profiles\") pod \"metrics-server-788b6459f7-2nk2j\" (UID: \"7f441a5f-864e-4788-b82c-734df5988bbd\") " pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j" Apr 24 21:30:23.190451 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.190311 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/7f441a5f-864e-4788-b82c-734df5988bbd-secret-metrics-server-client-certs\") pod \"metrics-server-788b6459f7-2nk2j\" (UID: \"7f441a5f-864e-4788-b82c-734df5988bbd\") " pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j" Apr 24 21:30:23.190638 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.190507 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/7f441a5f-864e-4788-b82c-734df5988bbd-audit-log\") pod \"metrics-server-788b6459f7-2nk2j\" (UID: \"7f441a5f-864e-4788-b82c-734df5988bbd\") " pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j" Apr 24 21:30:23.190638 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.190558 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dlcvf\" (UniqueName: \"kubernetes.io/projected/7f441a5f-864e-4788-b82c-734df5988bbd-kube-api-access-dlcvf\") pod \"metrics-server-788b6459f7-2nk2j\" (UID: \"7f441a5f-864e-4788-b82c-734df5988bbd\") " pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j" Apr 24 21:30:23.190638 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.190589 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/7f441a5f-864e-4788-b82c-734df5988bbd-secret-metrics-server-tls\") pod \"metrics-server-788b6459f7-2nk2j\" (UID: \"7f441a5f-864e-4788-b82c-734df5988bbd\") " 
pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j" Apr 24 21:30:23.190638 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.190621 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f441a5f-864e-4788-b82c-734df5988bbd-client-ca-bundle\") pod \"metrics-server-788b6459f7-2nk2j\" (UID: \"7f441a5f-864e-4788-b82c-734df5988bbd\") " pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j" Apr 24 21:30:23.191144 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.191053 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f441a5f-864e-4788-b82c-734df5988bbd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-788b6459f7-2nk2j\" (UID: \"7f441a5f-864e-4788-b82c-734df5988bbd\") " pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j" Apr 24 21:30:23.191244 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.191167 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/7f441a5f-864e-4788-b82c-734df5988bbd-audit-log\") pod \"metrics-server-788b6459f7-2nk2j\" (UID: \"7f441a5f-864e-4788-b82c-734df5988bbd\") " pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j" Apr 24 21:30:23.191430 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.191410 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/7f441a5f-864e-4788-b82c-734df5988bbd-metrics-server-audit-profiles\") pod \"metrics-server-788b6459f7-2nk2j\" (UID: \"7f441a5f-864e-4788-b82c-734df5988bbd\") " pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j" Apr 24 21:30:23.193578 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.193554 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: 
\"kubernetes.io/secret/7f441a5f-864e-4788-b82c-734df5988bbd-secret-metrics-server-tls\") pod \"metrics-server-788b6459f7-2nk2j\" (UID: \"7f441a5f-864e-4788-b82c-734df5988bbd\") " pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j" Apr 24 21:30:23.193671 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.193583 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f441a5f-864e-4788-b82c-734df5988bbd-client-ca-bundle\") pod \"metrics-server-788b6459f7-2nk2j\" (UID: \"7f441a5f-864e-4788-b82c-734df5988bbd\") " pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j" Apr 24 21:30:23.193671 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.193642 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/7f441a5f-864e-4788-b82c-734df5988bbd-secret-metrics-server-client-certs\") pod \"metrics-server-788b6459f7-2nk2j\" (UID: \"7f441a5f-864e-4788-b82c-734df5988bbd\") " pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j" Apr 24 21:30:23.199068 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.199041 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlcvf\" (UniqueName: \"kubernetes.io/projected/7f441a5f-864e-4788-b82c-734df5988bbd-kube-api-access-dlcvf\") pod \"metrics-server-788b6459f7-2nk2j\" (UID: \"7f441a5f-864e-4788-b82c-734df5988bbd\") " pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j" Apr 24 21:30:23.331464 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.331429 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j" Apr 24 21:30:23.961695 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.961659 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-b778d7999-4kq7h"] Apr 24 21:30:23.966725 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.966692 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" Apr 24 21:30:23.976047 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.976009 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 24 21:30:23.976190 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.976076 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 24 21:30:23.976190 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.976016 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 24 21:30:23.976393 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.976375 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 24 21:30:23.976491 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.976445 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 24 21:30:23.976573 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.976559 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-gzwlz\"" Apr 24 21:30:23.991464 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.991424 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 24 21:30:23.999852 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:23.999824 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-b778d7999-4kq7h"] Apr 24 21:30:24.101433 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.101397 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/a39ae09b-1beb-495e-8642-7487f193d5db-secret-telemeter-client\") pod \"telemeter-client-b778d7999-4kq7h\" (UID: \"a39ae09b-1beb-495e-8642-7487f193d5db\") " pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" Apr 24 21:30:24.101634 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.101452 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/a39ae09b-1beb-495e-8642-7487f193d5db-federate-client-tls\") pod \"telemeter-client-b778d7999-4kq7h\" (UID: \"a39ae09b-1beb-495e-8642-7487f193d5db\") " pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" Apr 24 21:30:24.101634 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.101503 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlfhn\" (UniqueName: \"kubernetes.io/projected/a39ae09b-1beb-495e-8642-7487f193d5db-kube-api-access-wlfhn\") pod \"telemeter-client-b778d7999-4kq7h\" (UID: \"a39ae09b-1beb-495e-8642-7487f193d5db\") " pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" Apr 24 21:30:24.101634 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.101555 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/a39ae09b-1beb-495e-8642-7487f193d5db-telemeter-client-tls\") pod 
\"telemeter-client-b778d7999-4kq7h\" (UID: \"a39ae09b-1beb-495e-8642-7487f193d5db\") " pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" Apr 24 21:30:24.101634 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.101588 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a39ae09b-1beb-495e-8642-7487f193d5db-telemeter-trusted-ca-bundle\") pod \"telemeter-client-b778d7999-4kq7h\" (UID: \"a39ae09b-1beb-495e-8642-7487f193d5db\") " pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" Apr 24 21:30:24.101816 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.101722 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a39ae09b-1beb-495e-8642-7487f193d5db-serving-certs-ca-bundle\") pod \"telemeter-client-b778d7999-4kq7h\" (UID: \"a39ae09b-1beb-495e-8642-7487f193d5db\") " pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" Apr 24 21:30:24.101816 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.101785 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a39ae09b-1beb-495e-8642-7487f193d5db-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-b778d7999-4kq7h\" (UID: \"a39ae09b-1beb-495e-8642-7487f193d5db\") " pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" Apr 24 21:30:24.101895 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.101822 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a39ae09b-1beb-495e-8642-7487f193d5db-metrics-client-ca\") pod \"telemeter-client-b778d7999-4kq7h\" (UID: \"a39ae09b-1beb-495e-8642-7487f193d5db\") " 
pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" Apr 24 21:30:24.202917 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.202874 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a39ae09b-1beb-495e-8642-7487f193d5db-metrics-client-ca\") pod \"telemeter-client-b778d7999-4kq7h\" (UID: \"a39ae09b-1beb-495e-8642-7487f193d5db\") " pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" Apr 24 21:30:24.203109 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.202940 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/a39ae09b-1beb-495e-8642-7487f193d5db-secret-telemeter-client\") pod \"telemeter-client-b778d7999-4kq7h\" (UID: \"a39ae09b-1beb-495e-8642-7487f193d5db\") " pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" Apr 24 21:30:24.203109 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.202971 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/a39ae09b-1beb-495e-8642-7487f193d5db-federate-client-tls\") pod \"telemeter-client-b778d7999-4kq7h\" (UID: \"a39ae09b-1beb-495e-8642-7487f193d5db\") " pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" Apr 24 21:30:24.203109 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.203003 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlfhn\" (UniqueName: \"kubernetes.io/projected/a39ae09b-1beb-495e-8642-7487f193d5db-kube-api-access-wlfhn\") pod \"telemeter-client-b778d7999-4kq7h\" (UID: \"a39ae09b-1beb-495e-8642-7487f193d5db\") " pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" Apr 24 21:30:24.203109 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.203038 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" 
(UniqueName: \"kubernetes.io/secret/a39ae09b-1beb-495e-8642-7487f193d5db-telemeter-client-tls\") pod \"telemeter-client-b778d7999-4kq7h\" (UID: \"a39ae09b-1beb-495e-8642-7487f193d5db\") " pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" Apr 24 21:30:24.203109 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.203102 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a39ae09b-1beb-495e-8642-7487f193d5db-telemeter-trusted-ca-bundle\") pod \"telemeter-client-b778d7999-4kq7h\" (UID: \"a39ae09b-1beb-495e-8642-7487f193d5db\") " pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" Apr 24 21:30:24.203398 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.203194 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a39ae09b-1beb-495e-8642-7487f193d5db-serving-certs-ca-bundle\") pod \"telemeter-client-b778d7999-4kq7h\" (UID: \"a39ae09b-1beb-495e-8642-7487f193d5db\") " pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" Apr 24 21:30:24.203398 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.203347 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a39ae09b-1beb-495e-8642-7487f193d5db-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-b778d7999-4kq7h\" (UID: \"a39ae09b-1beb-495e-8642-7487f193d5db\") " pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" Apr 24 21:30:24.205135 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.204716 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a39ae09b-1beb-495e-8642-7487f193d5db-metrics-client-ca\") pod \"telemeter-client-b778d7999-4kq7h\" (UID: \"a39ae09b-1beb-495e-8642-7487f193d5db\") " 
pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" Apr 24 21:30:24.205135 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.204799 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a39ae09b-1beb-495e-8642-7487f193d5db-serving-certs-ca-bundle\") pod \"telemeter-client-b778d7999-4kq7h\" (UID: \"a39ae09b-1beb-495e-8642-7487f193d5db\") " pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" Apr 24 21:30:24.207666 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.206787 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/a39ae09b-1beb-495e-8642-7487f193d5db-telemeter-client-tls\") pod \"telemeter-client-b778d7999-4kq7h\" (UID: \"a39ae09b-1beb-495e-8642-7487f193d5db\") " pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" Apr 24 21:30:24.207666 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.206904 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/a39ae09b-1beb-495e-8642-7487f193d5db-federate-client-tls\") pod \"telemeter-client-b778d7999-4kq7h\" (UID: \"a39ae09b-1beb-495e-8642-7487f193d5db\") " pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" Apr 24 21:30:24.210796 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.208384 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a39ae09b-1beb-495e-8642-7487f193d5db-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-b778d7999-4kq7h\" (UID: \"a39ae09b-1beb-495e-8642-7487f193d5db\") " pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" Apr 24 21:30:24.210796 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.209184 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a39ae09b-1beb-495e-8642-7487f193d5db-telemeter-trusted-ca-bundle\") pod \"telemeter-client-b778d7999-4kq7h\" (UID: \"a39ae09b-1beb-495e-8642-7487f193d5db\") " pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" Apr 24 21:30:24.211583 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.211563 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/a39ae09b-1beb-495e-8642-7487f193d5db-secret-telemeter-client\") pod \"telemeter-client-b778d7999-4kq7h\" (UID: \"a39ae09b-1beb-495e-8642-7487f193d5db\") " pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" Apr 24 21:30:24.230156 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.230081 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlfhn\" (UniqueName: \"kubernetes.io/projected/a39ae09b-1beb-495e-8642-7487f193d5db-kube-api-access-wlfhn\") pod \"telemeter-client-b778d7999-4kq7h\" (UID: \"a39ae09b-1beb-495e-8642-7487f193d5db\") " pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" Apr 24 21:30:24.279064 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.279023 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" Apr 24 21:30:24.371340 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.371285 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-76bc57ff8b-fs9w2" Apr 24 21:30:24.371340 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.371339 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-76bc57ff8b-fs9w2" Apr 24 21:30:24.377182 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.377158 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-76bc57ff8b-fs9w2" Apr 24 21:30:24.517747 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:24.517658 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-76bc57ff8b-fs9w2" Apr 24 21:30:25.006312 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.006278 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:30:25.013265 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.013221 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.016738 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.016708 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 21:30:25.016877 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.016761 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 21:30:25.016966 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.016936 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 21:30:25.017067 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.016939 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-7q9v6\"" Apr 24 21:30:25.017067 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.016984 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 21:30:25.017067 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.017049 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 21:30:25.017272 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.017068 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 21:30:25.017272 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.016984 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 21:30:25.017272 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.016978 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 21:30:25.017484 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.017385 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 21:30:25.018185 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.018024 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-42vgvl3l41224\"" Apr 24 21:30:25.019225 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.019021 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 21:30:25.020509 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.020485 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 21:30:25.024920 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.024743 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 21:30:25.037355 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.037325 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:30:25.114885 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.114855 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.114885 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.114893 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/4c24cbea-f7ef-440f-bcec-361354811b1c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.115127 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.114921 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.115127 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.114955 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.115127 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.115030 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-config\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.115127 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.115056 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-web-config\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.115127 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.115090 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.115396 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.115143 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4c24cbea-f7ef-440f-bcec-361354811b1c-config-out\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.115396 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.115173 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.115396 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.115214 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.115396 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.115268 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-configmap-kubelet-serving-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.115396 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.115294 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.115396 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.115368 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.115396 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.115390 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.115727 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.115412 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.115727 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.115442 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4c24cbea-f7ef-440f-bcec-361354811b1c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.115727 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.115496 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.115727 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.115525 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvxnp\" (UniqueName: \"kubernetes.io/projected/4c24cbea-f7ef-440f-bcec-361354811b1c-kube-api-access-dvxnp\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.216933 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.216893 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.217138 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.216947 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.217138 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.216970 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.217138 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.217040 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.217138 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.217065 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.217138 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.217092 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.217138 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.217123 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/4c24cbea-f7ef-440f-bcec-361354811b1c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.217501 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.217192 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.217501 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.217217 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dvxnp\" (UniqueName: \"kubernetes.io/projected/4c24cbea-f7ef-440f-bcec-361354811b1c-kube-api-access-dvxnp\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.217501 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.217263 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.217501 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.217291 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4c24cbea-f7ef-440f-bcec-361354811b1c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.217501 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.217322 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.217501 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.217357 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.217501 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.217389 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-config\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.217501 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.217414 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-web-config\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.217501 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.217452 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.217501 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.217485 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4c24cbea-f7ef-440f-bcec-361354811b1c-config-out\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.217981 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.217512 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.217981 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.217819 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.217981 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.217913 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.218868 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.218175 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.218868 
ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.218549 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4c24cbea-f7ef-440f-bcec-361354811b1c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.218868 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.218610 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.220954 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.220913 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.221936 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.221498 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-config\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.222083 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.221521 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 24 21:30:25.222192 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.221713 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.222295 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.221880 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-web-config\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.222646 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.222588 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.223583 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.223553 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.223957 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.223889 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.223957 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.223891 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4c24cbea-f7ef-440f-bcec-361354811b1c-config-out\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.224538 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.224513 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4c24cbea-f7ef-440f-bcec-361354811b1c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.227900 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.227858 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.228090 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.228060 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.230335 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.230313 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvxnp\" (UniqueName: \"kubernetes.io/projected/4c24cbea-f7ef-440f-bcec-361354811b1c-kube-api-access-dvxnp\") pod \"prometheus-k8s-0\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:25.327576 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:25.327460 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:28.557924 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:28.557892 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-754db4fb4f-dmkdh"] Apr 24 21:30:28.561097 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:30:28.561072 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf620266_e2bc_4c7f_9f17_6e6bd7623500.slice/crio-38246b862f19fd15839b8ff5a26fb989c23e5c032dfd17c644ab37cc619f2a87 WatchSource:0}: Error finding container 38246b862f19fd15839b8ff5a26fb989c23e5c032dfd17c644ab37cc619f2a87: Status 404 returned error can't find the container with id 38246b862f19fd15839b8ff5a26fb989c23e5c032dfd17c644ab37cc619f2a87 Apr 24 21:30:28.777723 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:28.777627 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-788b6459f7-2nk2j"] Apr 24 21:30:28.779315 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:28.779290 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-b778d7999-4kq7h"] Apr 24 21:30:28.781612 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:30:28.781578 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda39ae09b_1beb_495e_8642_7487f193d5db.slice/crio-d45f60470d793c3933c554f21055a4e63498b21afe01da046068481fdb833e51 WatchSource:0}: Error finding container d45f60470d793c3933c554f21055a4e63498b21afe01da046068481fdb833e51: Status 404 returned error can't find the container with id d45f60470d793c3933c554f21055a4e63498b21afe01da046068481fdb833e51 Apr 24 21:30:28.782595 ip-10-0-129-36 
kubenswrapper[2577]: W0424 21:30:28.782530 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f441a5f_864e_4788_b82c_734df5988bbd.slice/crio-f9b003eb0a4449e3215995b8677ad12c10335684440923f1c2e043905291f48d WatchSource:0}: Error finding container f9b003eb0a4449e3215995b8677ad12c10335684440923f1c2e043905291f48d: Status 404 returned error can't find the container with id f9b003eb0a4449e3215995b8677ad12c10335684440923f1c2e043905291f48d Apr 24 21:30:28.784946 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:28.784869 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:30:28.787323 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:30:28.787294 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c24cbea_f7ef_440f_bcec_361354811b1c.slice/crio-c9350c368ef28c00e5f934486b4173958da58a93529769254bd7d09a5d90f311 WatchSource:0}: Error finding container c9350c368ef28c00e5f934486b4173958da58a93529769254bd7d09a5d90f311: Status 404 returned error can't find the container with id c9350c368ef28c00e5f934486b4173958da58a93529769254bd7d09a5d90f311 Apr 24 21:30:29.533478 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:29.533434 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j" event={"ID":"7f441a5f-864e-4788-b82c-734df5988bbd","Type":"ContainerStarted","Data":"f9b003eb0a4449e3215995b8677ad12c10335684440923f1c2e043905291f48d"} Apr 24 21:30:29.535326 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:29.535293 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" event={"ID":"a39ae09b-1beb-495e-8642-7487f193d5db","Type":"ContainerStarted","Data":"d45f60470d793c3933c554f21055a4e63498b21afe01da046068481fdb833e51"} Apr 24 21:30:29.537760 ip-10-0-129-36 
kubenswrapper[2577]: I0424 21:30:29.537689 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" event={"ID":"bf620266-e2bc-4c7f-9f17-6e6bd7623500","Type":"ContainerStarted","Data":"38246b862f19fd15839b8ff5a26fb989c23e5c032dfd17c644ab37cc619f2a87"} Apr 24 21:30:29.540462 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:29.540409 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c24cbea-f7ef-440f-bcec-361354811b1c","Type":"ContainerStarted","Data":"c9350c368ef28c00e5f934486b4173958da58a93529769254bd7d09a5d90f311"} Apr 24 21:30:29.548440 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:29.547156 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-kvj2r" event={"ID":"5701db21-73a5-4846-bd66-5cb8f4331749","Type":"ContainerStarted","Data":"d264b3eeb9385e6e1df891f4f0441c80666dde6789f8ffe737354552663868a4"} Apr 24 21:30:29.548440 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:29.547891 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-kvj2r" Apr 24 21:30:29.551946 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:29.551917 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-w6q5b" event={"ID":"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9","Type":"ContainerStarted","Data":"5da6b7bae4c0eeee050476c99b7feb2d020cdf75352f51adc3fe20abcaa7e035"} Apr 24 21:30:29.552070 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:29.551948 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-w6q5b" event={"ID":"137ba05c-a1f1-46e3-b5fc-28b957ed5fc9","Type":"ContainerStarted","Data":"6fbeef95d3e3cb0c9879bf28479b7f73efdeb5360b916d9fbc66e0c4511085c4"} Apr 24 21:30:29.566172 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:29.565873 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-console/downloads-6bcc868b7-kvj2r" Apr 24 21:30:29.570960 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:29.570507 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-kvj2r" podStartSLOduration=1.814852595 podStartE2EDuration="18.570454355s" podCreationTimestamp="2026-04-24 21:30:11 +0000 UTC" firstStartedPulling="2026-04-24 21:30:11.761426297 +0000 UTC m=+165.375696554" lastFinishedPulling="2026-04-24 21:30:28.517028039 +0000 UTC m=+182.131298314" observedRunningTime="2026-04-24 21:30:29.568838225 +0000 UTC m=+183.183108504" watchObservedRunningTime="2026-04-24 21:30:29.570454355 +0000 UTC m=+183.184724613" Apr 24 21:30:29.591178 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:29.590650 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-w6q5b" podStartSLOduration=10.315806086 podStartE2EDuration="11.59053403s" podCreationTimestamp="2026-04-24 21:30:18 +0000 UTC" firstStartedPulling="2026-04-24 21:30:19.05619804 +0000 UTC m=+172.670468300" lastFinishedPulling="2026-04-24 21:30:20.330925972 +0000 UTC m=+173.945196244" observedRunningTime="2026-04-24 21:30:29.590137758 +0000 UTC m=+183.204408038" watchObservedRunningTime="2026-04-24 21:30:29.59053403 +0000 UTC m=+183.204804322" Apr 24 21:30:33.575674 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:33.573622 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j" event={"ID":"7f441a5f-864e-4788-b82c-734df5988bbd","Type":"ContainerStarted","Data":"3a6c8524750105b8c001af1b7ad7623c5e5080437297fe32c5b5616e1ebf49c9"} Apr 24 21:30:33.576967 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:33.576175 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" 
event={"ID":"a39ae09b-1beb-495e-8642-7487f193d5db","Type":"ContainerStarted","Data":"341f05fcf76eda1c1ab6c7936673530d40c42db7ddf751590d0c7ea6d3a10f22"} Apr 24 21:30:33.576967 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:33.576214 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" event={"ID":"a39ae09b-1beb-495e-8642-7487f193d5db","Type":"ContainerStarted","Data":"cb603b9a0b75410e0c06105909902386dee78081f680b0291403edb27110b02a"} Apr 24 21:30:33.579786 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:33.579733 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" event={"ID":"bf620266-e2bc-4c7f-9f17-6e6bd7623500","Type":"ContainerStarted","Data":"a4795789003e4fc336b52606c5b48b899bdad5de21c1c11a2d228f9548e89395"} Apr 24 21:30:33.579786 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:33.579763 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" event={"ID":"bf620266-e2bc-4c7f-9f17-6e6bd7623500","Type":"ContainerStarted","Data":"023eb032e66e6a36d4573ff10d163d6aa8a54dc447b0492c7c378d866694e864"} Apr 24 21:30:33.581439 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:33.581392 2577 generic.go:358] "Generic (PLEG): container finished" podID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerID="83e978ffdb73cd7c15efefd128ca495b37b245af02c671880b7de395b2dcccc7" exitCode=0 Apr 24 21:30:33.581519 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:33.581449 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c24cbea-f7ef-440f-bcec-361354811b1c","Type":"ContainerDied","Data":"83e978ffdb73cd7c15efefd128ca495b37b245af02c671880b7de395b2dcccc7"} Apr 24 21:30:33.598019 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:33.597983 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j" podStartSLOduration=7.248199424 podStartE2EDuration="11.597971437s" podCreationTimestamp="2026-04-24 21:30:22 +0000 UTC" firstStartedPulling="2026-04-24 21:30:28.785275608 +0000 UTC m=+182.399545865" lastFinishedPulling="2026-04-24 21:30:33.135047621 +0000 UTC m=+186.749317878" observedRunningTime="2026-04-24 21:30:33.597756959 +0000 UTC m=+187.212027241" watchObservedRunningTime="2026-04-24 21:30:33.597971437 +0000 UTC m=+187.212241716" Apr 24 21:30:34.588194 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:34.588148 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" event={"ID":"a39ae09b-1beb-495e-8642-7487f193d5db","Type":"ContainerStarted","Data":"05f90a24c9a50078c674555f2a4495564b2d6b84c3551084bd961ac60e883915"} Apr 24 21:30:34.591218 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:34.591188 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" event={"ID":"bf620266-e2bc-4c7f-9f17-6e6bd7623500","Type":"ContainerStarted","Data":"fcb4e2ebc9aec51c73a26ab7f3ce5dbd13fa73712dd8988056acfe5d30c2c468"} Apr 24 21:30:34.615323 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:34.615226 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-b778d7999-4kq7h" podStartSLOduration=7.257513666 podStartE2EDuration="11.615203568s" podCreationTimestamp="2026-04-24 21:30:23 +0000 UTC" firstStartedPulling="2026-04-24 21:30:28.784103782 +0000 UTC m=+182.398374040" lastFinishedPulling="2026-04-24 21:30:33.141793669 +0000 UTC m=+186.756063942" observedRunningTime="2026-04-24 21:30:34.613960596 +0000 UTC m=+188.228230890" watchObservedRunningTime="2026-04-24 21:30:34.615203568 +0000 UTC m=+188.229473849" Apr 24 21:30:35.603238 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:35.603159 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" event={"ID":"bf620266-e2bc-4c7f-9f17-6e6bd7623500","Type":"ContainerStarted","Data":"d388f263f7155fb260f92837b6e3de5bb055c0cd0d5542b3621017ef2baaf057"} Apr 24 21:30:35.604171 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:35.603701 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" event={"ID":"bf620266-e2bc-4c7f-9f17-6e6bd7623500","Type":"ContainerStarted","Data":"be12934e6676ad7de6ab51cdf481a4f277d999bb9d0629617765d82db5d88356"} Apr 24 21:30:35.604171 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:35.603731 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" event={"ID":"bf620266-e2bc-4c7f-9f17-6e6bd7623500","Type":"ContainerStarted","Data":"5c778a2f032dac6e7b0d394b48c95cf61480830a7223bfd6f7cf133e39d34a0c"} Apr 24 21:30:35.628221 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:35.628041 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh" podStartSLOduration=7.207655372 podStartE2EDuration="13.628022866s" podCreationTimestamp="2026-04-24 21:30:22 +0000 UTC" firstStartedPulling="2026-04-24 21:30:28.563072172 +0000 UTC m=+182.177342428" lastFinishedPulling="2026-04-24 21:30:34.983439649 +0000 UTC m=+188.597709922" observedRunningTime="2026-04-24 21:30:35.626679749 +0000 UTC m=+189.240950029" watchObservedRunningTime="2026-04-24 21:30:35.628022866 +0000 UTC m=+189.242293146" Apr 24 21:30:35.748150 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:35.748108 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-589c888867-xrr24"] Apr 24 21:30:35.767044 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:35.767011 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-589c888867-xrr24" Apr 24 21:30:35.778808 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:35.778760 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-589c888867-xrr24"] Apr 24 21:30:35.781140 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:35.781117 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 24 21:30:35.938830 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:35.938684 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njw72\" (UniqueName: \"kubernetes.io/projected/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-kube-api-access-njw72\") pod \"console-589c888867-xrr24\" (UID: \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\") " pod="openshift-console/console-589c888867-xrr24" Apr 24 21:30:35.938830 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:35.938753 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-trusted-ca-bundle\") pod \"console-589c888867-xrr24\" (UID: \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\") " pod="openshift-console/console-589c888867-xrr24" Apr 24 21:30:35.939119 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:35.938856 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-oauth-serving-cert\") pod \"console-589c888867-xrr24\" (UID: \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\") " pod="openshift-console/console-589c888867-xrr24" Apr 24 21:30:35.939119 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:35.938920 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-console-oauth-config\") pod \"console-589c888867-xrr24\" (UID: \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\") " pod="openshift-console/console-589c888867-xrr24" Apr 24 21:30:35.939119 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:35.939048 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-service-ca\") pod \"console-589c888867-xrr24\" (UID: \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\") " pod="openshift-console/console-589c888867-xrr24" Apr 24 21:30:35.939119 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:35.939092 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-console-config\") pod \"console-589c888867-xrr24\" (UID: \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\") " pod="openshift-console/console-589c888867-xrr24" Apr 24 21:30:35.939340 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:35.939162 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-console-serving-cert\") pod \"console-589c888867-xrr24\" (UID: \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\") " pod="openshift-console/console-589c888867-xrr24" Apr 24 21:30:36.040595 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:36.040276 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-njw72\" (UniqueName: \"kubernetes.io/projected/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-kube-api-access-njw72\") pod \"console-589c888867-xrr24\" (UID: \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\") " pod="openshift-console/console-589c888867-xrr24" Apr 24 21:30:36.040595 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:36.040355 
2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-trusted-ca-bundle\") pod \"console-589c888867-xrr24\" (UID: \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\") " pod="openshift-console/console-589c888867-xrr24" Apr 24 21:30:36.040595 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:36.040392 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-oauth-serving-cert\") pod \"console-589c888867-xrr24\" (UID: \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\") " pod="openshift-console/console-589c888867-xrr24" Apr 24 21:30:36.040595 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:36.040426 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-console-oauth-config\") pod \"console-589c888867-xrr24\" (UID: \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\") " pod="openshift-console/console-589c888867-xrr24" Apr 24 21:30:36.040595 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:36.040491 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-service-ca\") pod \"console-589c888867-xrr24\" (UID: \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\") " pod="openshift-console/console-589c888867-xrr24" Apr 24 21:30:36.040595 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:36.040519 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-console-config\") pod \"console-589c888867-xrr24\" (UID: \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\") " pod="openshift-console/console-589c888867-xrr24" Apr 24 21:30:36.040595 
ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:36.040580 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-console-serving-cert\") pod \"console-589c888867-xrr24\" (UID: \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\") " pod="openshift-console/console-589c888867-xrr24" Apr 24 21:30:36.042227 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:36.042002 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-trusted-ca-bundle\") pod \"console-589c888867-xrr24\" (UID: \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\") " pod="openshift-console/console-589c888867-xrr24" Apr 24 21:30:36.042227 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:36.042216 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-service-ca\") pod \"console-589c888867-xrr24\" (UID: \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\") " pod="openshift-console/console-589c888867-xrr24" Apr 24 21:30:36.042951 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:36.042924 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-oauth-serving-cert\") pod \"console-589c888867-xrr24\" (UID: \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\") " pod="openshift-console/console-589c888867-xrr24" Apr 24 21:30:36.043470 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:36.043423 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-console-config\") pod \"console-589c888867-xrr24\" (UID: \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\") " pod="openshift-console/console-589c888867-xrr24" Apr 
24 21:30:36.043722 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:36.043699 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-console-oauth-config\") pod \"console-589c888867-xrr24\" (UID: \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\") " pod="openshift-console/console-589c888867-xrr24" Apr 24 21:30:36.047454 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:36.047411 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-console-serving-cert\") pod \"console-589c888867-xrr24\" (UID: \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\") " pod="openshift-console/console-589c888867-xrr24" Apr 24 21:30:36.051056 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:36.051029 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-njw72\" (UniqueName: \"kubernetes.io/projected/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-kube-api-access-njw72\") pod \"console-589c888867-xrr24\" (UID: \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\") " pod="openshift-console/console-589c888867-xrr24" Apr 24 21:30:36.082102 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:36.082064 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-589c888867-xrr24"
Apr 24 21:30:36.608463 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:36.608433 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh"
Apr 24 21:30:37.012889 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:37.012696 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-589c888867-xrr24"]
Apr 24 21:30:37.016026 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:30:37.015973 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9abeb458_bb2b_424e_b0d1_14cc0a9e141e.slice/crio-ba0b46b61629ae4e5cc26437de5ea106d273efa4247c6c8a2e0e3055b757dc51 WatchSource:0}: Error finding container ba0b46b61629ae4e5cc26437de5ea106d273efa4247c6c8a2e0e3055b757dc51: Status 404 returned error can't find the container with id ba0b46b61629ae4e5cc26437de5ea106d273efa4247c6c8a2e0e3055b757dc51
Apr 24 21:30:37.614277 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:37.613140 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-589c888867-xrr24" event={"ID":"9abeb458-bb2b-424e-b0d1-14cc0a9e141e","Type":"ContainerStarted","Data":"78b18551e14ecd97d2d8f02f2808e8a705f22a7b2624cf3f9bb242a3e32585c5"}
Apr 24 21:30:37.614277 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:37.613193 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-589c888867-xrr24" event={"ID":"9abeb458-bb2b-424e-b0d1-14cc0a9e141e","Type":"ContainerStarted","Data":"ba0b46b61629ae4e5cc26437de5ea106d273efa4247c6c8a2e0e3055b757dc51"}
Apr 24 21:30:37.617815 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:37.617780 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c24cbea-f7ef-440f-bcec-361354811b1c","Type":"ContainerStarted","Data":"921afedac06a4d0538f516bf04a268c99fafe613af15364404629a3aebf5699e"}
Apr 24 21:30:37.617815 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:37.617827 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c24cbea-f7ef-440f-bcec-361354811b1c","Type":"ContainerStarted","Data":"d3062a1b3efa611e61c0a22d5193da8673f3eb1dae72d5e1c5826933b7986bde"}
Apr 24 21:30:38.624527 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:38.624485 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c24cbea-f7ef-440f-bcec-361354811b1c","Type":"ContainerStarted","Data":"5e8fb58328449842444f9eae9810e6b0c2b74a7228a1bd0f1e29b10283345b94"}
Apr 24 21:30:38.624527 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:38.624530 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c24cbea-f7ef-440f-bcec-361354811b1c","Type":"ContainerStarted","Data":"2b0c76c6a51c376af8905b1d03446310c262133a4df1007ead3145cc71aa0289"}
Apr 24 21:30:38.625040 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:38.624547 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c24cbea-f7ef-440f-bcec-361354811b1c","Type":"ContainerStarted","Data":"32224e97545aad0a305cedea9fc8406a7373a35ff4f143514a205b0f9bfc26d3"}
Apr 24 21:30:38.625040 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:38.624558 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c24cbea-f7ef-440f-bcec-361354811b1c","Type":"ContainerStarted","Data":"4764b30ab0dc4841c98a436cf98b5aef5b4e6fbb3bc93b6bb065652bd6993d17"}
Apr 24 21:30:38.673733 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:38.673668 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=6.162535037 podStartE2EDuration="14.673648247s" podCreationTimestamp="2026-04-24 21:30:24 +0000 UTC" firstStartedPulling="2026-04-24 21:30:28.789184732 +0000 UTC m=+182.403454989" lastFinishedPulling="2026-04-24 21:30:37.300297942 +0000 UTC m=+190.914568199" observedRunningTime="2026-04-24 21:30:38.672745458 +0000 UTC m=+192.287015737" watchObservedRunningTime="2026-04-24 21:30:38.673648247 +0000 UTC m=+192.287918528"
Apr 24 21:30:38.675121 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:38.675077 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-589c888867-xrr24" podStartSLOduration=3.675066454 podStartE2EDuration="3.675066454s" podCreationTimestamp="2026-04-24 21:30:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:30:37.642980155 +0000 UTC m=+191.257250435" watchObservedRunningTime="2026-04-24 21:30:38.675066454 +0000 UTC m=+192.289336744"
Apr 24 21:30:40.327864 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:40.327827 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:30:42.625426 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:42.625395 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-754db4fb4f-dmkdh"
Apr 24 21:30:43.332468 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:43.332419 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j"
Apr 24 21:30:43.332645 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:43.332467 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j"
Apr 24 21:30:46.083039 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:46.082999 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-589c888867-xrr24"
Apr 24 21:30:46.083039 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:46.083041 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-589c888867-xrr24"
Apr 24 21:30:46.087963 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:46.087940 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-589c888867-xrr24"
Apr 24 21:30:46.655608 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:46.655558 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-589c888867-xrr24"
Apr 24 21:30:46.708705 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:30:46.708674 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76bc57ff8b-fs9w2"]
Apr 24 21:31:03.337524 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:03.337492 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j"
Apr 24 21:31:03.341351 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:03.341327 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-788b6459f7-2nk2j"
Apr 24 21:31:11.729580 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:11.729530 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-76bc57ff8b-fs9w2" podUID="30e7a79a-0748-4988-8fd9-9a116182fc1e" containerName="console" containerID="cri-o://67c5229ef818d76cd5a7311dd1cc4c794503b65a1658cc832ab2ce6566d5bbd7" gracePeriod=15
Apr 24 21:31:12.021537 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.021513 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76bc57ff8b-fs9w2_30e7a79a-0748-4988-8fd9-9a116182fc1e/console/0.log"
Apr 24 21:31:12.021671 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.021589 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76bc57ff8b-fs9w2"
Apr 24 21:31:12.066773 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.066743 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/30e7a79a-0748-4988-8fd9-9a116182fc1e-console-oauth-config\") pod \"30e7a79a-0748-4988-8fd9-9a116182fc1e\" (UID: \"30e7a79a-0748-4988-8fd9-9a116182fc1e\") "
Apr 24 21:31:12.066974 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.066807 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z77x\" (UniqueName: \"kubernetes.io/projected/30e7a79a-0748-4988-8fd9-9a116182fc1e-kube-api-access-6z77x\") pod \"30e7a79a-0748-4988-8fd9-9a116182fc1e\" (UID: \"30e7a79a-0748-4988-8fd9-9a116182fc1e\") "
Apr 24 21:31:12.066974 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.066863 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/30e7a79a-0748-4988-8fd9-9a116182fc1e-service-ca\") pod \"30e7a79a-0748-4988-8fd9-9a116182fc1e\" (UID: \"30e7a79a-0748-4988-8fd9-9a116182fc1e\") "
Apr 24 21:31:12.066974 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.066923 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/30e7a79a-0748-4988-8fd9-9a116182fc1e-console-serving-cert\") pod \"30e7a79a-0748-4988-8fd9-9a116182fc1e\" (UID: \"30e7a79a-0748-4988-8fd9-9a116182fc1e\") "
Apr 24 21:31:12.066974 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.066966 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/30e7a79a-0748-4988-8fd9-9a116182fc1e-console-config\") pod \"30e7a79a-0748-4988-8fd9-9a116182fc1e\" (UID: \"30e7a79a-0748-4988-8fd9-9a116182fc1e\") "
Apr 24 21:31:12.067198 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.067011 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/30e7a79a-0748-4988-8fd9-9a116182fc1e-oauth-serving-cert\") pod \"30e7a79a-0748-4988-8fd9-9a116182fc1e\" (UID: \"30e7a79a-0748-4988-8fd9-9a116182fc1e\") "
Apr 24 21:31:12.067477 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.067346 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30e7a79a-0748-4988-8fd9-9a116182fc1e-service-ca" (OuterVolumeSpecName: "service-ca") pod "30e7a79a-0748-4988-8fd9-9a116182fc1e" (UID: "30e7a79a-0748-4988-8fd9-9a116182fc1e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:31:12.067590 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.067515 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30e7a79a-0748-4988-8fd9-9a116182fc1e-console-config" (OuterVolumeSpecName: "console-config") pod "30e7a79a-0748-4988-8fd9-9a116182fc1e" (UID: "30e7a79a-0748-4988-8fd9-9a116182fc1e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:31:12.067590 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.067552 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30e7a79a-0748-4988-8fd9-9a116182fc1e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "30e7a79a-0748-4988-8fd9-9a116182fc1e" (UID: "30e7a79a-0748-4988-8fd9-9a116182fc1e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:31:12.069819 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.069784 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e7a79a-0748-4988-8fd9-9a116182fc1e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "30e7a79a-0748-4988-8fd9-9a116182fc1e" (UID: "30e7a79a-0748-4988-8fd9-9a116182fc1e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:31:12.069916 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.069835 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e7a79a-0748-4988-8fd9-9a116182fc1e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "30e7a79a-0748-4988-8fd9-9a116182fc1e" (UID: "30e7a79a-0748-4988-8fd9-9a116182fc1e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:31:12.070149 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.070125 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30e7a79a-0748-4988-8fd9-9a116182fc1e-kube-api-access-6z77x" (OuterVolumeSpecName: "kube-api-access-6z77x") pod "30e7a79a-0748-4988-8fd9-9a116182fc1e" (UID: "30e7a79a-0748-4988-8fd9-9a116182fc1e"). InnerVolumeSpecName "kube-api-access-6z77x". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:31:12.168738 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.168707 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/30e7a79a-0748-4988-8fd9-9a116182fc1e-service-ca\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\""
Apr 24 21:31:12.168738 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.168737 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/30e7a79a-0748-4988-8fd9-9a116182fc1e-console-serving-cert\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\""
Apr 24 21:31:12.168937 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.168751 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/30e7a79a-0748-4988-8fd9-9a116182fc1e-console-config\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\""
Apr 24 21:31:12.168937 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.168764 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/30e7a79a-0748-4988-8fd9-9a116182fc1e-oauth-serving-cert\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\""
Apr 24 21:31:12.168937 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.168775 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/30e7a79a-0748-4988-8fd9-9a116182fc1e-console-oauth-config\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\""
Apr 24 21:31:12.168937 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.168788 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6z77x\" (UniqueName: \"kubernetes.io/projected/30e7a79a-0748-4988-8fd9-9a116182fc1e-kube-api-access-6z77x\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\""
Apr 24 21:31:12.736609 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.736583 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76bc57ff8b-fs9w2_30e7a79a-0748-4988-8fd9-9a116182fc1e/console/0.log"
Apr 24 21:31:12.737014 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.736626 2577 generic.go:358] "Generic (PLEG): container finished" podID="30e7a79a-0748-4988-8fd9-9a116182fc1e" containerID="67c5229ef818d76cd5a7311dd1cc4c794503b65a1658cc832ab2ce6566d5bbd7" exitCode=2
Apr 24 21:31:12.737014 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.736677 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76bc57ff8b-fs9w2" event={"ID":"30e7a79a-0748-4988-8fd9-9a116182fc1e","Type":"ContainerDied","Data":"67c5229ef818d76cd5a7311dd1cc4c794503b65a1658cc832ab2ce6566d5bbd7"}
Apr 24 21:31:12.737014 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.736701 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76bc57ff8b-fs9w2"
Apr 24 21:31:12.737014 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.736708 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76bc57ff8b-fs9w2" event={"ID":"30e7a79a-0748-4988-8fd9-9a116182fc1e","Type":"ContainerDied","Data":"e3b9dfbf3e9000f6fefc506a8c344994781cf13c9b66f7a0a750f2fd73a73c0c"}
Apr 24 21:31:12.737014 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.736728 2577 scope.go:117] "RemoveContainer" containerID="67c5229ef818d76cd5a7311dd1cc4c794503b65a1658cc832ab2ce6566d5bbd7"
Apr 24 21:31:12.746188 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.746170 2577 scope.go:117] "RemoveContainer" containerID="67c5229ef818d76cd5a7311dd1cc4c794503b65a1658cc832ab2ce6566d5bbd7"
Apr 24 21:31:12.746555 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:31:12.746532 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67c5229ef818d76cd5a7311dd1cc4c794503b65a1658cc832ab2ce6566d5bbd7\": container with ID starting with 67c5229ef818d76cd5a7311dd1cc4c794503b65a1658cc832ab2ce6566d5bbd7 not found: ID does not exist" containerID="67c5229ef818d76cd5a7311dd1cc4c794503b65a1658cc832ab2ce6566d5bbd7"
Apr 24 21:31:12.746640 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.746564 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67c5229ef818d76cd5a7311dd1cc4c794503b65a1658cc832ab2ce6566d5bbd7"} err="failed to get container status \"67c5229ef818d76cd5a7311dd1cc4c794503b65a1658cc832ab2ce6566d5bbd7\": rpc error: code = NotFound desc = could not find container \"67c5229ef818d76cd5a7311dd1cc4c794503b65a1658cc832ab2ce6566d5bbd7\": container with ID starting with 67c5229ef818d76cd5a7311dd1cc4c794503b65a1658cc832ab2ce6566d5bbd7 not found: ID does not exist"
Apr 24 21:31:12.769329 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.769235 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76bc57ff8b-fs9w2"]
Apr 24 21:31:12.774697 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.774664 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-76bc57ff8b-fs9w2"]
Apr 24 21:31:12.971900 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:12.971865 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30e7a79a-0748-4988-8fd9-9a116182fc1e" path="/var/lib/kubelet/pods/30e7a79a-0748-4988-8fd9-9a116182fc1e/volumes"
Apr 24 21:31:25.328448 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:25.328388 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:25.344302 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:25.344278 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:25.793706 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:25.793673 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:38.712089 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:38.711996 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs\") pod \"network-metrics-daemon-jcztz\" (UID: \"b0b45556-212a-460b-a5ae-108beeb6197d\") " pod="openshift-multus/network-metrics-daemon-jcztz"
Apr 24 21:31:38.714333 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:38.714312 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0b45556-212a-460b-a5ae-108beeb6197d-metrics-certs\") pod \"network-metrics-daemon-jcztz\" (UID: \"b0b45556-212a-460b-a5ae-108beeb6197d\") " pod="openshift-multus/network-metrics-daemon-jcztz"
Apr 24 21:31:38.872781 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:38.872745 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mgr5d\""
Apr 24 21:31:38.880574 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:38.880551 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcztz"
Apr 24 21:31:39.037162 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:39.037108 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jcztz"]
Apr 24 21:31:39.040101 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:31:39.040070 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0b45556_212a_460b_a5ae_108beeb6197d.slice/crio-8357f7f1e571a1a5f120fd8a5d8b923ff9b97dd9b8e12f73d270ecba73dd2756 WatchSource:0}: Error finding container 8357f7f1e571a1a5f120fd8a5d8b923ff9b97dd9b8e12f73d270ecba73dd2756: Status 404 returned error can't find the container with id 8357f7f1e571a1a5f120fd8a5d8b923ff9b97dd9b8e12f73d270ecba73dd2756
Apr 24 21:31:39.822056 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:39.822015 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jcztz" event={"ID":"b0b45556-212a-460b-a5ae-108beeb6197d","Type":"ContainerStarted","Data":"8357f7f1e571a1a5f120fd8a5d8b923ff9b97dd9b8e12f73d270ecba73dd2756"}
Apr 24 21:31:40.827193 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:40.827153 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jcztz" event={"ID":"b0b45556-212a-460b-a5ae-108beeb6197d","Type":"ContainerStarted","Data":"384af0c41c4d6d4248cdc1a1ee3ae787c56642397c7873c1a90e979ecf5db0bd"}
Apr 24 21:31:40.827193 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:40.827196 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jcztz" event={"ID":"b0b45556-212a-460b-a5ae-108beeb6197d","Type":"ContainerStarted","Data":"a4cb1caeb7a04ead4af377fa694f637df23e2768594e505d796134b042f6d091"}
Apr 24 21:31:40.847647 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:40.847592 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jcztz" podStartSLOduration=252.808421306 podStartE2EDuration="4m13.847575415s" podCreationTimestamp="2026-04-24 21:27:27 +0000 UTC" firstStartedPulling="2026-04-24 21:31:39.042360975 +0000 UTC m=+252.656631235" lastFinishedPulling="2026-04-24 21:31:40.081515086 +0000 UTC m=+253.695785344" observedRunningTime="2026-04-24 21:31:40.845606825 +0000 UTC m=+254.459877121" watchObservedRunningTime="2026-04-24 21:31:40.847575415 +0000 UTC m=+254.461845693"
Apr 24 21:31:43.662273 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.662185 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 21:31:43.662982 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.662941 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerName="prometheus" containerID="cri-o://d3062a1b3efa611e61c0a22d5193da8673f3eb1dae72d5e1c5826933b7986bde" gracePeriod=600
Apr 24 21:31:43.662982 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.662966 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerName="kube-rbac-proxy-thanos" containerID="cri-o://5e8fb58328449842444f9eae9810e6b0c2b74a7228a1bd0f1e29b10283345b94" gracePeriod=600
Apr 24 21:31:43.663221 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.663098 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerName="thanos-sidecar" containerID="cri-o://4764b30ab0dc4841c98a436cf98b5aef5b4e6fbb3bc93b6bb065652bd6993d17" gracePeriod=600
Apr 24 21:31:43.663221 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.663182 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerName="kube-rbac-proxy" containerID="cri-o://2b0c76c6a51c376af8905b1d03446310c262133a4df1007ead3145cc71aa0289" gracePeriod=600
Apr 24 21:31:43.663221 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.663185 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerName="kube-rbac-proxy-web" containerID="cri-o://32224e97545aad0a305cedea9fc8406a7373a35ff4f143514a205b0f9bfc26d3" gracePeriod=600
Apr 24 21:31:43.663430 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.663119 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerName="config-reloader" containerID="cri-o://921afedac06a4d0538f516bf04a268c99fafe613af15364404629a3aebf5699e" gracePeriod=600
Apr 24 21:31:43.841103 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.841073 2577 generic.go:358] "Generic (PLEG): container finished" podID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerID="5e8fb58328449842444f9eae9810e6b0c2b74a7228a1bd0f1e29b10283345b94" exitCode=0
Apr 24 21:31:43.841103 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.841099 2577 generic.go:358] "Generic (PLEG): container finished" podID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerID="2b0c76c6a51c376af8905b1d03446310c262133a4df1007ead3145cc71aa0289" exitCode=0
Apr 24 21:31:43.841103 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.841105 2577 generic.go:358] "Generic (PLEG): container finished" podID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerID="32224e97545aad0a305cedea9fc8406a7373a35ff4f143514a205b0f9bfc26d3" exitCode=0
Apr 24 21:31:43.841400 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.841111 2577 generic.go:358] "Generic (PLEG): container finished" podID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerID="4764b30ab0dc4841c98a436cf98b5aef5b4e6fbb3bc93b6bb065652bd6993d17" exitCode=0
Apr 24 21:31:43.841400 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.841119 2577 generic.go:358] "Generic (PLEG): container finished" podID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerID="921afedac06a4d0538f516bf04a268c99fafe613af15364404629a3aebf5699e" exitCode=0
Apr 24 21:31:43.841400 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.841129 2577 generic.go:358] "Generic (PLEG): container finished" podID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerID="d3062a1b3efa611e61c0a22d5193da8673f3eb1dae72d5e1c5826933b7986bde" exitCode=0
Apr 24 21:31:43.841400 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.841142 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c24cbea-f7ef-440f-bcec-361354811b1c","Type":"ContainerDied","Data":"5e8fb58328449842444f9eae9810e6b0c2b74a7228a1bd0f1e29b10283345b94"}
Apr 24 21:31:43.841400 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.841176 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c24cbea-f7ef-440f-bcec-361354811b1c","Type":"ContainerDied","Data":"2b0c76c6a51c376af8905b1d03446310c262133a4df1007ead3145cc71aa0289"}
Apr 24 21:31:43.841400 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.841185 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c24cbea-f7ef-440f-bcec-361354811b1c","Type":"ContainerDied","Data":"32224e97545aad0a305cedea9fc8406a7373a35ff4f143514a205b0f9bfc26d3"}
Apr 24 21:31:43.841400 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.841194 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c24cbea-f7ef-440f-bcec-361354811b1c","Type":"ContainerDied","Data":"4764b30ab0dc4841c98a436cf98b5aef5b4e6fbb3bc93b6bb065652bd6993d17"}
Apr 24 21:31:43.841400 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.841203 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c24cbea-f7ef-440f-bcec-361354811b1c","Type":"ContainerDied","Data":"921afedac06a4d0538f516bf04a268c99fafe613af15364404629a3aebf5699e"}
Apr 24 21:31:43.841400 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.841213 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c24cbea-f7ef-440f-bcec-361354811b1c","Type":"ContainerDied","Data":"d3062a1b3efa611e61c0a22d5193da8673f3eb1dae72d5e1c5826933b7986bde"}
Apr 24 21:31:43.931800 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.931773 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:43.958907 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.957266 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-kube-rbac-proxy\") pod \"4c24cbea-f7ef-440f-bcec-361354811b1c\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") "
Apr 24 21:31:43.958907 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.957311 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-grpc-tls\") pod \"4c24cbea-f7ef-440f-bcec-361354811b1c\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") "
Apr 24 21:31:43.958907 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.957348 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-metrics-client-certs\") pod \"4c24cbea-f7ef-440f-bcec-361354811b1c\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") "
Apr 24 21:31:43.958907 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.957372 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-prometheus-k8s-tls\") pod \"4c24cbea-f7ef-440f-bcec-361354811b1c\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") "
Apr 24 21:31:43.958907 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.957404 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"4c24cbea-f7ef-440f-bcec-361354811b1c\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") "
Apr 24 21:31:43.958907 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.957432 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-configmap-kubelet-serving-ca-bundle\") pod \"4c24cbea-f7ef-440f-bcec-361354811b1c\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") "
Apr 24 21:31:43.958907 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.957474 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-prometheus-k8s-rulefiles-0\") pod \"4c24cbea-f7ef-440f-bcec-361354811b1c\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") "
Apr 24 21:31:43.958907 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.957502 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-thanos-prometheus-http-client-file\") pod \"4c24cbea-f7ef-440f-bcec-361354811b1c\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") "
Apr 24 21:31:43.958907 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.957550 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4c24cbea-f7ef-440f-bcec-361354811b1c-tls-assets\") pod \"4c24cbea-f7ef-440f-bcec-361354811b1c\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") "
Apr 24 21:31:43.958907 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.957571 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-config\") pod \"4c24cbea-f7ef-440f-bcec-361354811b1c\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") "
Apr 24 21:31:43.958907 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.957594 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"4c24cbea-f7ef-440f-bcec-361354811b1c\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") "
Apr 24 21:31:43.958907 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.957618 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvxnp\" (UniqueName: \"kubernetes.io/projected/4c24cbea-f7ef-440f-bcec-361354811b1c-kube-api-access-dvxnp\") pod \"4c24cbea-f7ef-440f-bcec-361354811b1c\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") "
Apr 24 21:31:43.958907 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.957675 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-configmap-serving-certs-ca-bundle\") pod \"4c24cbea-f7ef-440f-bcec-361354811b1c\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") "
Apr 24 21:31:43.958907 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.957704 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-web-config\") pod \"4c24cbea-f7ef-440f-bcec-361354811b1c\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") "
Apr 24 21:31:43.958907 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.957727 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4c24cbea-f7ef-440f-bcec-361354811b1c-config-out\") pod \"4c24cbea-f7ef-440f-bcec-361354811b1c\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") "
Apr 24 21:31:43.958907 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.957754 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-prometheus-trusted-ca-bundle\") pod \"4c24cbea-f7ef-440f-bcec-361354811b1c\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") "
Apr 24 21:31:43.958907 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.957810 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-configmap-metrics-client-ca\") pod \"4c24cbea-f7ef-440f-bcec-361354811b1c\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") "
Apr 24 21:31:43.958907 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.957838 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4c24cbea-f7ef-440f-bcec-361354811b1c-prometheus-k8s-db\") pod \"4c24cbea-f7ef-440f-bcec-361354811b1c\" (UID: \"4c24cbea-f7ef-440f-bcec-361354811b1c\") "
Apr 24 21:31:43.959950 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.959050 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c24cbea-f7ef-440f-bcec-361354811b1c-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "4c24cbea-f7ef-440f-bcec-361354811b1c" (UID: "4c24cbea-f7ef-440f-bcec-361354811b1c"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:31:43.959950 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.959550 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "4c24cbea-f7ef-440f-bcec-361354811b1c" (UID: "4c24cbea-f7ef-440f-bcec-361354811b1c"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:31:43.959950 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.959818 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "4c24cbea-f7ef-440f-bcec-361354811b1c" (UID: "4c24cbea-f7ef-440f-bcec-361354811b1c"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:31:43.960976 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.960924 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "4c24cbea-f7ef-440f-bcec-361354811b1c" (UID: "4c24cbea-f7ef-440f-bcec-361354811b1c"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:31:43.962241 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.962140 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "4c24cbea-f7ef-440f-bcec-361354811b1c" (UID: "4c24cbea-f7ef-440f-bcec-361354811b1c"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:31:43.962543 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.962511 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "4c24cbea-f7ef-440f-bcec-361354811b1c" (UID: "4c24cbea-f7ef-440f-bcec-361354811b1c"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:31:43.962969 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.962787 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-config" (OuterVolumeSpecName: "config") pod "4c24cbea-f7ef-440f-bcec-361354811b1c" (UID: "4c24cbea-f7ef-440f-bcec-361354811b1c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:31:43.963163 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.963135 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "4c24cbea-f7ef-440f-bcec-361354811b1c" (UID: "4c24cbea-f7ef-440f-bcec-361354811b1c"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0".
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:31:43.963625 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.963342 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "4c24cbea-f7ef-440f-bcec-361354811b1c" (UID: "4c24cbea-f7ef-440f-bcec-361354811b1c"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:43.964833 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.964789 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c24cbea-f7ef-440f-bcec-361354811b1c-kube-api-access-dvxnp" (OuterVolumeSpecName: "kube-api-access-dvxnp") pod "4c24cbea-f7ef-440f-bcec-361354811b1c" (UID: "4c24cbea-f7ef-440f-bcec-361354811b1c"). InnerVolumeSpecName "kube-api-access-dvxnp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:31:43.965087 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.965047 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "4c24cbea-f7ef-440f-bcec-361354811b1c" (UID: "4c24cbea-f7ef-440f-bcec-361354811b1c"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:43.965565 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.965512 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c24cbea-f7ef-440f-bcec-361354811b1c-config-out" (OuterVolumeSpecName: "config-out") pod "4c24cbea-f7ef-440f-bcec-361354811b1c" (UID: "4c24cbea-f7ef-440f-bcec-361354811b1c"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:31:43.968525 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.967713 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "4c24cbea-f7ef-440f-bcec-361354811b1c" (UID: "4c24cbea-f7ef-440f-bcec-361354811b1c"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:43.968525 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.968034 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "4c24cbea-f7ef-440f-bcec-361354811b1c" (UID: "4c24cbea-f7ef-440f-bcec-361354811b1c"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:43.969134 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.969095 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "4c24cbea-f7ef-440f-bcec-361354811b1c" (UID: "4c24cbea-f7ef-440f-bcec-361354811b1c"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:43.969339 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.969317 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c24cbea-f7ef-440f-bcec-361354811b1c-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "4c24cbea-f7ef-440f-bcec-361354811b1c" (UID: "4c24cbea-f7ef-440f-bcec-361354811b1c"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:31:43.969339 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.969317 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "4c24cbea-f7ef-440f-bcec-361354811b1c" (UID: "4c24cbea-f7ef-440f-bcec-361354811b1c"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:43.980454 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:43.980423 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-web-config" (OuterVolumeSpecName: "web-config") pod "4c24cbea-f7ef-440f-bcec-361354811b1c" (UID: "4c24cbea-f7ef-440f-bcec-361354811b1c"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:44.059078 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.059037 2577 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4c24cbea-f7ef-440f-bcec-361354811b1c-tls-assets\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\"" Apr 24 21:31:44.059078 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.059073 2577 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-config\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\"" Apr 24 21:31:44.059078 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.059083 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\"" Apr 24 
21:31:44.059352 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.059094 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dvxnp\" (UniqueName: \"kubernetes.io/projected/4c24cbea-f7ef-440f-bcec-361354811b1c-kube-api-access-dvxnp\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\"" Apr 24 21:31:44.059352 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.059104 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\"" Apr 24 21:31:44.059352 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.059113 2577 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-web-config\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\"" Apr 24 21:31:44.059352 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.059121 2577 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4c24cbea-f7ef-440f-bcec-361354811b1c-config-out\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\"" Apr 24 21:31:44.059352 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.059129 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-prometheus-trusted-ca-bundle\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\"" Apr 24 21:31:44.059352 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.059137 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-configmap-metrics-client-ca\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\"" Apr 24 21:31:44.059352 ip-10-0-129-36 kubenswrapper[2577]: 
I0424 21:31:44.059147 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4c24cbea-f7ef-440f-bcec-361354811b1c-prometheus-k8s-db\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\"" Apr 24 21:31:44.059352 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.059155 2577 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-kube-rbac-proxy\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\"" Apr 24 21:31:44.059352 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.059164 2577 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-grpc-tls\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\"" Apr 24 21:31:44.059352 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.059172 2577 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-metrics-client-certs\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\"" Apr 24 21:31:44.059352 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.059180 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-prometheus-k8s-tls\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\"" Apr 24 21:31:44.059352 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.059190 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\"" Apr 24 21:31:44.059352 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.059198 2577 
reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\"" Apr 24 21:31:44.059352 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.059207 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4c24cbea-f7ef-440f-bcec-361354811b1c-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\"" Apr 24 21:31:44.059352 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.059216 2577 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4c24cbea-f7ef-440f-bcec-361354811b1c-thanos-prometheus-http-client-file\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\"" Apr 24 21:31:44.846703 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.846663 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4c24cbea-f7ef-440f-bcec-361354811b1c","Type":"ContainerDied","Data":"c9350c368ef28c00e5f934486b4173958da58a93529769254bd7d09a5d90f311"} Apr 24 21:31:44.846703 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.846698 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:44.847188 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.846721 2577 scope.go:117] "RemoveContainer" containerID="5e8fb58328449842444f9eae9810e6b0c2b74a7228a1bd0f1e29b10283345b94" Apr 24 21:31:44.855297 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.855276 2577 scope.go:117] "RemoveContainer" containerID="2b0c76c6a51c376af8905b1d03446310c262133a4df1007ead3145cc71aa0289" Apr 24 21:31:44.862867 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.862846 2577 scope.go:117] "RemoveContainer" containerID="32224e97545aad0a305cedea9fc8406a7373a35ff4f143514a205b0f9bfc26d3" Apr 24 21:31:44.869167 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.869139 2577 scope.go:117] "RemoveContainer" containerID="4764b30ab0dc4841c98a436cf98b5aef5b4e6fbb3bc93b6bb065652bd6993d17" Apr 24 21:31:44.872182 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.872162 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:31:44.876943 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.876921 2577 scope.go:117] "RemoveContainer" containerID="921afedac06a4d0538f516bf04a268c99fafe613af15364404629a3aebf5699e" Apr 24 21:31:44.877056 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.877040 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:31:44.883578 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.883557 2577 scope.go:117] "RemoveContainer" containerID="d3062a1b3efa611e61c0a22d5193da8673f3eb1dae72d5e1c5826933b7986bde" Apr 24 21:31:44.890486 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.890471 2577 scope.go:117] "RemoveContainer" containerID="83e978ffdb73cd7c15efefd128ca495b37b245af02c671880b7de395b2dcccc7" Apr 24 21:31:44.910140 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.910116 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 
21:31:44.910476 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.910462 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30e7a79a-0748-4988-8fd9-9a116182fc1e" containerName="console" Apr 24 21:31:44.910533 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.910478 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e7a79a-0748-4988-8fd9-9a116182fc1e" containerName="console" Apr 24 21:31:44.910533 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.910487 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerName="kube-rbac-proxy-thanos" Apr 24 21:31:44.910533 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.910493 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerName="kube-rbac-proxy-thanos" Apr 24 21:31:44.910533 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.910500 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerName="init-config-reloader" Apr 24 21:31:44.910533 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.910513 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerName="init-config-reloader" Apr 24 21:31:44.910533 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.910522 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerName="kube-rbac-proxy" Apr 24 21:31:44.910533 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.910527 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerName="kube-rbac-proxy" Apr 24 21:31:44.910765 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.910540 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c24cbea-f7ef-440f-bcec-361354811b1c" 
containerName="config-reloader" Apr 24 21:31:44.910765 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.910546 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerName="config-reloader" Apr 24 21:31:44.910765 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.910560 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerName="prometheus" Apr 24 21:31:44.910765 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.910565 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerName="prometheus" Apr 24 21:31:44.910765 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.910570 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerName="thanos-sidecar" Apr 24 21:31:44.910765 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.910575 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerName="thanos-sidecar" Apr 24 21:31:44.910765 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.910581 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerName="kube-rbac-proxy-web" Apr 24 21:31:44.910765 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.910586 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerName="kube-rbac-proxy-web" Apr 24 21:31:44.910765 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.910631 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerName="prometheus" Apr 24 21:31:44.910765 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.910638 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="30e7a79a-0748-4988-8fd9-9a116182fc1e" 
containerName="console" Apr 24 21:31:44.910765 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.910645 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerName="thanos-sidecar" Apr 24 21:31:44.910765 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.910651 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerName="kube-rbac-proxy-web" Apr 24 21:31:44.910765 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.910657 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerName="kube-rbac-proxy" Apr 24 21:31:44.910765 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.910666 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerName="kube-rbac-proxy-thanos" Apr 24 21:31:44.910765 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.910672 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c24cbea-f7ef-440f-bcec-361354811b1c" containerName="config-reloader" Apr 24 21:31:44.915951 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.915934 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:44.919557 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.919541 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 21:31:44.919658 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.919613 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 21:31:44.919980 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.919935 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-42vgvl3l41224\"" Apr 24 21:31:44.919980 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.919952 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 21:31:44.920164 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.920025 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 21:31:44.920337 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.920303 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 21:31:44.920337 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.920330 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 21:31:44.920483 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.920342 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 21:31:44.920483 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.920335 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 21:31:44.920483 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.920374 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 21:31:44.920782 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.920766 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 21:31:44.920863 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.920837 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-7q9v6\"" Apr 24 21:31:44.925873 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.925853 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 21:31:44.932756 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.932677 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 21:31:44.934119 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.934099 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:31:44.967191 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.967155 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:44.967191 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.967192 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:44.967434 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.967210 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:44.967434 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.967228 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:44.967434 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.967318 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-config\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:44.967434 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.967365 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 
Apr 24 21:31:44.967434 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.967404 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:44.967434 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.967434 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:44.967619 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.967454 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:44.967619 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.967539 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-config-out\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:44.967619 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.967559 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:44.967712 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.967633 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-web-config\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:44.967712 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.967682 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:44.967712 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.967708 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:44.967814 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.967726 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:44.967814 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.967750 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:44.967814 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.967799 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrj6d\" (UniqueName: \"kubernetes.io/projected/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-kube-api-access-jrj6d\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:44.967915 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.967867 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:44.972618 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:44.972593 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c24cbea-f7ef-440f-bcec-361354811b1c" path="/var/lib/kubelet/pods/4c24cbea-f7ef-440f-bcec-361354811b1c/volumes"
Apr 24 21:31:45.069237 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.069182 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.069237 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.069237 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.069529 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.069367 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.069529 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.069386 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.069529 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.069410 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.069529 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.069450 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-config\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.069745 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.069598 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.069745 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.069644 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.069745 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.069658 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.069745 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.069692 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.069745 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.069717 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.069993 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.069751 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-config-out\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.069993 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.069776 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.069993 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.069802 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-web-config\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.069993 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.069870 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.069993 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.069911 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.069993 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.069938 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.069993 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.069969 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.069993 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.069995 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrj6d\" (UniqueName: \"kubernetes.io/projected/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-kube-api-access-jrj6d\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.070412 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.070231 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.070412 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.070241 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.072481 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.072455 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-config-out\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.072614 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.072547 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-config\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.072679 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.072625 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.072778 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.072755 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.073037 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.072909 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-web-config\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.073136 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.073088 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.073325 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.073236 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.073853 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.073757 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.073853 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.073804 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.074138 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.074119 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.075072 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.075047 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.075350 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.075332 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.075595 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.075573 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.076022 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.076006 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.077986 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.077970 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrj6d\" (UniqueName: \"kubernetes.io/projected/b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd-kube-api-access-jrj6d\") pod \"prometheus-k8s-0\" (UID: \"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.226098 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.226007 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:45.352765 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.352738 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 21:31:45.354701 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:31:45.354674 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6bb4f5f_8724_4bdd_8ba1_d0b27efb5cbd.slice/crio-a51b7a160b32a217d6e8a467cb509daabd5663dd36b066ad63afdc4db8be8c3c WatchSource:0}: Error finding container a51b7a160b32a217d6e8a467cb509daabd5663dd36b066ad63afdc4db8be8c3c: Status 404 returned error can't find the container with id a51b7a160b32a217d6e8a467cb509daabd5663dd36b066ad63afdc4db8be8c3c
Apr 24 21:31:45.855508 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.855463 2577 generic.go:358] "Generic (PLEG): container finished" podID="b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd" containerID="38b7d88321d767a314c074658925beea45ba0d3730ac1a207c74c833153b57bc" exitCode=0
Apr 24 21:31:45.855930 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.855546 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd","Type":"ContainerDied","Data":"38b7d88321d767a314c074658925beea45ba0d3730ac1a207c74c833153b57bc"}
Apr 24 21:31:45.855930 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:45.855586 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd","Type":"ContainerStarted","Data":"a51b7a160b32a217d6e8a467cb509daabd5663dd36b066ad63afdc4db8be8c3c"}
Apr 24 21:31:46.863437 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:46.863401 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd","Type":"ContainerStarted","Data":"73e1273bf2a6bc0fcd6e7b1c2aa68923038cea811856c7c9e26d7ebbe1f8a171"}
Apr 24 21:31:46.863437 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:46.863435 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd","Type":"ContainerStarted","Data":"f9c2925c0810b1c5c6713574a2e74c3046eec8642b05114e3f74f82bcb316711"}
Apr 24 21:31:46.863437 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:46.863446 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd","Type":"ContainerStarted","Data":"c601750f752caaa2f01acae5a669e4aaf72aec36c2f3116ae2c9fd835962eb7e"}
Apr 24 21:31:46.863881 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:46.863454 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd","Type":"ContainerStarted","Data":"af213e844520dddfb013a0cc25fb9b494de2c7942d7cdb31a7328a28746f97e0"}
Apr 24 21:31:46.863881 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:46.863462 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd","Type":"ContainerStarted","Data":"c2139c8881fe422e66420fdae32c148e1a34df5f48fbcd430b49f790c21c2fa4"}
Apr 24 21:31:46.863881 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:46.863470 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd","Type":"ContainerStarted","Data":"f5a45d47eb395acdc81dbd819897e982f229dc6154b56fbd84cc7d3c67b7705c"}
Apr 24 21:31:46.899160 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:46.899102 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.899084303 podStartE2EDuration="2.899084303s" podCreationTimestamp="2026-04-24 21:31:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:31:46.896978609 +0000 UTC m=+260.511248899" watchObservedRunningTime="2026-04-24 21:31:46.899084303 +0000 UTC m=+260.513354584"
Apr 24 21:31:50.227091 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:50.227055 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:31:54.429861 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:31:54.429818 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-589c888867-xrr24"]
Apr 24 21:32:19.451166 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.451101 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-589c888867-xrr24" podUID="9abeb458-bb2b-424e-b0d1-14cc0a9e141e" containerName="console" containerID="cri-o://78b18551e14ecd97d2d8f02f2808e8a705f22a7b2624cf3f9bb242a3e32585c5" gracePeriod=15
Apr 24 21:32:19.685236 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.685214 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-589c888867-xrr24_9abeb458-bb2b-424e-b0d1-14cc0a9e141e/console/0.log"
Apr 24 21:32:19.685379 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.685289 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-589c888867-xrr24"
Apr 24 21:32:19.762009 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.761924 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njw72\" (UniqueName: \"kubernetes.io/projected/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-kube-api-access-njw72\") pod \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\" (UID: \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\") "
Apr 24 21:32:19.762009 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.761957 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-trusted-ca-bundle\") pod \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\" (UID: \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\") "
Apr 24 21:32:19.762221 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.762032 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-console-serving-cert\") pod \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\" (UID: \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\") "
Apr 24 21:32:19.762221 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.762063 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-service-ca\") pod \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\" (UID: \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\") "
Apr 24 21:32:19.762221 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.762189 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-console-oauth-config\") pod \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\" (UID: \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\") "
Apr 24 21:32:19.762405 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.762235 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-oauth-serving-cert\") pod \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\" (UID: \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\") "
Apr 24 21:32:19.762405 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.762312 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-console-config\") pod \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\" (UID: \"9abeb458-bb2b-424e-b0d1-14cc0a9e141e\") "
Apr 24 21:32:19.762515 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.762441 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9abeb458-bb2b-424e-b0d1-14cc0a9e141e" (UID: "9abeb458-bb2b-424e-b0d1-14cc0a9e141e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:32:19.762630 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.762506 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-service-ca" (OuterVolumeSpecName: "service-ca") pod "9abeb458-bb2b-424e-b0d1-14cc0a9e141e" (UID: "9abeb458-bb2b-424e-b0d1-14cc0a9e141e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:32:19.762751 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.762726 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-service-ca\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\""
Apr 24 21:32:19.762866 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.762761 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-trusted-ca-bundle\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\""
Apr 24 21:32:19.762866 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.762719 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9abeb458-bb2b-424e-b0d1-14cc0a9e141e" (UID: "9abeb458-bb2b-424e-b0d1-14cc0a9e141e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:32:19.762866 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.762741 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-console-config" (OuterVolumeSpecName: "console-config") pod "9abeb458-bb2b-424e-b0d1-14cc0a9e141e" (UID: "9abeb458-bb2b-424e-b0d1-14cc0a9e141e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:32:19.764162 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.764128 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9abeb458-bb2b-424e-b0d1-14cc0a9e141e" (UID: "9abeb458-bb2b-424e-b0d1-14cc0a9e141e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:32:19.764272 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.764209 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9abeb458-bb2b-424e-b0d1-14cc0a9e141e" (UID: "9abeb458-bb2b-424e-b0d1-14cc0a9e141e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:32:19.764321 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.764278 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-kube-api-access-njw72" (OuterVolumeSpecName: "kube-api-access-njw72") pod "9abeb458-bb2b-424e-b0d1-14cc0a9e141e" (UID: "9abeb458-bb2b-424e-b0d1-14cc0a9e141e"). InnerVolumeSpecName "kube-api-access-njw72". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:32:19.864158 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.864126 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-console-serving-cert\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\""
Apr 24 21:32:19.864158 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.864153 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-console-oauth-config\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\""
Apr 24 21:32:19.864158 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.864162 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-oauth-serving-cert\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\""
Apr 24 21:32:19.864392 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.864171 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-console-config\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\""
Apr 24 21:32:19.864392 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.864181 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-njw72\" (UniqueName: \"kubernetes.io/projected/9abeb458-bb2b-424e-b0d1-14cc0a9e141e-kube-api-access-njw72\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\""
Apr 24 21:32:19.961006 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.960977 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-589c888867-xrr24_9abeb458-bb2b-424e-b0d1-14cc0a9e141e/console/0.log"
Apr 24 21:32:19.961183 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.961014 2577 generic.go:358] "Generic (PLEG): container finished" podID="9abeb458-bb2b-424e-b0d1-14cc0a9e141e" containerID="78b18551e14ecd97d2d8f02f2808e8a705f22a7b2624cf3f9bb242a3e32585c5" exitCode=2
Apr 24 21:32:19.961183 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.961058 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-589c888867-xrr24" event={"ID":"9abeb458-bb2b-424e-b0d1-14cc0a9e141e","Type":"ContainerDied","Data":"78b18551e14ecd97d2d8f02f2808e8a705f22a7b2624cf3f9bb242a3e32585c5"}
Apr 24 21:32:19.961183 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.961081 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-589c888867-xrr24" event={"ID":"9abeb458-bb2b-424e-b0d1-14cc0a9e141e","Type":"ContainerDied","Data":"ba0b46b61629ae4e5cc26437de5ea106d273efa4247c6c8a2e0e3055b757dc51"}
Apr 24 21:32:19.961183 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.961095 2577 scope.go:117] "RemoveContainer" containerID="78b18551e14ecd97d2d8f02f2808e8a705f22a7b2624cf3f9bb242a3e32585c5"
Apr 24 21:32:19.961183 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.961094 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-589c888867-xrr24"
Apr 24 21:32:19.970097 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.970075 2577 scope.go:117] "RemoveContainer" containerID="78b18551e14ecd97d2d8f02f2808e8a705f22a7b2624cf3f9bb242a3e32585c5"
Apr 24 21:32:19.970435 ip-10-0-129-36 kubenswrapper[2577]: E0424 21:32:19.970411 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78b18551e14ecd97d2d8f02f2808e8a705f22a7b2624cf3f9bb242a3e32585c5\": container with ID starting with 78b18551e14ecd97d2d8f02f2808e8a705f22a7b2624cf3f9bb242a3e32585c5 not found: ID does not exist" containerID="78b18551e14ecd97d2d8f02f2808e8a705f22a7b2624cf3f9bb242a3e32585c5"
Apr 24 21:32:19.970511 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.970445 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78b18551e14ecd97d2d8f02f2808e8a705f22a7b2624cf3f9bb242a3e32585c5"} err="failed to get container status \"78b18551e14ecd97d2d8f02f2808e8a705f22a7b2624cf3f9bb242a3e32585c5\": rpc error: code = NotFound desc = could not find container \"78b18551e14ecd97d2d8f02f2808e8a705f22a7b2624cf3f9bb242a3e32585c5\": container with ID starting with 78b18551e14ecd97d2d8f02f2808e8a705f22a7b2624cf3f9bb242a3e32585c5 not found: ID does not exist"
Apr 24 21:32:19.989149 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.989124 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-589c888867-xrr24"]
Apr 24 21:32:19.994550 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:19.994520 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-589c888867-xrr24"]
Apr 24 21:32:20.971050 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:20.971012 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9abeb458-bb2b-424e-b0d1-14cc0a9e141e" path="/var/lib/kubelet/pods/9abeb458-bb2b-424e-b0d1-14cc0a9e141e/volumes"
Apr 24 21:32:26.877155 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:26.877122 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/1.log"
Apr 24 21:32:26.877902 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:26.877874 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/1.log"
Apr 24 21:32:26.881121 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:26.881100 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log"
Apr 24 21:32:26.881550 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:26.881530 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log"
Apr 24 21:32:26.887567 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:26.887544 2577 kubelet.go:1628] "Image garbage collection succeeded"
Apr 24 21:32:45.226435 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:45.226397 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:32:45.241850 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:45.241818 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:32:46.048332 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:32:46.048305 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:35:45.242604 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:35:45.242571 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-rqzzm"] Apr 24 21:35:45.243034 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:35:45.242886 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9abeb458-bb2b-424e-b0d1-14cc0a9e141e" containerName="console" Apr 24 21:35:45.243034 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:35:45.242896 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9abeb458-bb2b-424e-b0d1-14cc0a9e141e" containerName="console" Apr 24 21:35:45.243034 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:35:45.242954 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9abeb458-bb2b-424e-b0d1-14cc0a9e141e" containerName="console" Apr 24 21:35:45.245782 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:35:45.245763 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-rqzzm" Apr 24 21:35:45.250146 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:35:45.250126 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-z6l4z\"" Apr 24 21:35:45.250694 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:35:45.250675 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 21:35:45.252158 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:35:45.251775 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 21:35:45.252158 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:35:45.251795 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 21:35:45.258619 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:35:45.258596 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-rqzzm"] Apr 24 21:35:45.332722 ip-10-0-129-36 
kubenswrapper[2577]: I0424 21:35:45.332691 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knm8d\" (UniqueName: \"kubernetes.io/projected/34d01e73-a26b-409a-9160-fb65661c3301-kube-api-access-knm8d\") pod \"s3-init-rqzzm\" (UID: \"34d01e73-a26b-409a-9160-fb65661c3301\") " pod="kserve/s3-init-rqzzm" Apr 24 21:35:45.433132 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:35:45.433076 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-knm8d\" (UniqueName: \"kubernetes.io/projected/34d01e73-a26b-409a-9160-fb65661c3301-kube-api-access-knm8d\") pod \"s3-init-rqzzm\" (UID: \"34d01e73-a26b-409a-9160-fb65661c3301\") " pod="kserve/s3-init-rqzzm" Apr 24 21:35:45.448809 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:35:45.448782 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-knm8d\" (UniqueName: \"kubernetes.io/projected/34d01e73-a26b-409a-9160-fb65661c3301-kube-api-access-knm8d\") pod \"s3-init-rqzzm\" (UID: \"34d01e73-a26b-409a-9160-fb65661c3301\") " pod="kserve/s3-init-rqzzm" Apr 24 21:35:45.562991 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:35:45.562910 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-rqzzm" Apr 24 21:35:45.684092 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:35:45.684062 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-rqzzm"] Apr 24 21:35:45.686423 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:35:45.686383 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34d01e73_a26b_409a_9160_fb65661c3301.slice/crio-f6a1ced6a22d1b72a6e08f08d7ebcd731b1dcc9b0f80062d3aabda515efa06d7 WatchSource:0}: Error finding container f6a1ced6a22d1b72a6e08f08d7ebcd731b1dcc9b0f80062d3aabda515efa06d7: Status 404 returned error can't find the container with id f6a1ced6a22d1b72a6e08f08d7ebcd731b1dcc9b0f80062d3aabda515efa06d7 Apr 24 21:35:45.688056 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:35:45.688037 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:35:46.555014 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:35:46.554963 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-rqzzm" event={"ID":"34d01e73-a26b-409a-9160-fb65661c3301","Type":"ContainerStarted","Data":"f6a1ced6a22d1b72a6e08f08d7ebcd731b1dcc9b0f80062d3aabda515efa06d7"} Apr 24 21:35:50.568145 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:35:50.568056 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-rqzzm" event={"ID":"34d01e73-a26b-409a-9160-fb65661c3301","Type":"ContainerStarted","Data":"c2c8d1d7e7f6fa9676fc80d3e835ee9c9f5f0603299480d57660f5e7c290f460"} Apr 24 21:35:50.584393 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:35:50.584340 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-rqzzm" podStartSLOduration=1.083249274 podStartE2EDuration="5.584321582s" podCreationTimestamp="2026-04-24 21:35:45 +0000 UTC" firstStartedPulling="2026-04-24 21:35:45.688171199 +0000 UTC m=+499.302441457" lastFinishedPulling="2026-04-24 
21:35:50.189243502 +0000 UTC m=+503.803513765" observedRunningTime="2026-04-24 21:35:50.582898835 +0000 UTC m=+504.197169115" watchObservedRunningTime="2026-04-24 21:35:50.584321582 +0000 UTC m=+504.198591861" Apr 24 21:35:53.579181 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:35:53.579151 2577 generic.go:358] "Generic (PLEG): container finished" podID="34d01e73-a26b-409a-9160-fb65661c3301" containerID="c2c8d1d7e7f6fa9676fc80d3e835ee9c9f5f0603299480d57660f5e7c290f460" exitCode=0 Apr 24 21:35:53.579567 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:35:53.579226 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-rqzzm" event={"ID":"34d01e73-a26b-409a-9160-fb65661c3301","Type":"ContainerDied","Data":"c2c8d1d7e7f6fa9676fc80d3e835ee9c9f5f0603299480d57660f5e7c290f460"} Apr 24 21:35:54.705906 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:35:54.705885 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-rqzzm" Apr 24 21:35:54.811173 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:35:54.811135 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knm8d\" (UniqueName: \"kubernetes.io/projected/34d01e73-a26b-409a-9160-fb65661c3301-kube-api-access-knm8d\") pod \"34d01e73-a26b-409a-9160-fb65661c3301\" (UID: \"34d01e73-a26b-409a-9160-fb65661c3301\") " Apr 24 21:35:54.813345 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:35:54.813309 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34d01e73-a26b-409a-9160-fb65661c3301-kube-api-access-knm8d" (OuterVolumeSpecName: "kube-api-access-knm8d") pod "34d01e73-a26b-409a-9160-fb65661c3301" (UID: "34d01e73-a26b-409a-9160-fb65661c3301"). InnerVolumeSpecName "kube-api-access-knm8d". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:35:54.911936 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:35:54.911852 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-knm8d\" (UniqueName: \"kubernetes.io/projected/34d01e73-a26b-409a-9160-fb65661c3301-kube-api-access-knm8d\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\"" Apr 24 21:35:55.586033 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:35:55.585995 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-rqzzm" event={"ID":"34d01e73-a26b-409a-9160-fb65661c3301","Type":"ContainerDied","Data":"f6a1ced6a22d1b72a6e08f08d7ebcd731b1dcc9b0f80062d3aabda515efa06d7"} Apr 24 21:35:55.586033 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:35:55.586028 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-rqzzm" Apr 24 21:35:55.586261 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:35:55.586035 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6a1ced6a22d1b72a6e08f08d7ebcd731b1dcc9b0f80062d3aabda515efa06d7" Apr 24 21:36:02.782347 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:36:02.782306 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-qwgx9"] Apr 24 21:36:02.782743 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:36:02.782644 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34d01e73-a26b-409a-9160-fb65661c3301" containerName="s3-init" Apr 24 21:36:02.782743 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:36:02.782655 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d01e73-a26b-409a-9160-fb65661c3301" containerName="s3-init" Apr 24 21:36:02.782743 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:36:02.782713 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="34d01e73-a26b-409a-9160-fb65661c3301" containerName="s3-init" Apr 24 21:36:02.785857 ip-10-0-129-36 kubenswrapper[2577]: I0424 
21:36:02.785838 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-qwgx9" Apr 24 21:36:02.788378 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:36:02.788354 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 21:36:02.788487 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:36:02.788448 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-z6l4z\"" Apr 24 21:36:02.789188 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:36:02.789172 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 21:36:02.789240 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:36:02.789223 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 24 21:36:02.795006 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:36:02.794984 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-qwgx9"] Apr 24 21:36:02.879796 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:36:02.879756 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7brtk\" (UniqueName: \"kubernetes.io/projected/2ecd9ac8-1162-499e-91fe-bf91a1c300a3-kube-api-access-7brtk\") pod \"s3-tls-init-custom-qwgx9\" (UID: \"2ecd9ac8-1162-499e-91fe-bf91a1c300a3\") " pod="kserve/s3-tls-init-custom-qwgx9" Apr 24 21:36:02.981138 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:36:02.981097 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7brtk\" (UniqueName: \"kubernetes.io/projected/2ecd9ac8-1162-499e-91fe-bf91a1c300a3-kube-api-access-7brtk\") pod \"s3-tls-init-custom-qwgx9\" (UID: \"2ecd9ac8-1162-499e-91fe-bf91a1c300a3\") " pod="kserve/s3-tls-init-custom-qwgx9" Apr 24 21:36:02.990692 ip-10-0-129-36 
kubenswrapper[2577]: I0424 21:36:02.990661 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7brtk\" (UniqueName: \"kubernetes.io/projected/2ecd9ac8-1162-499e-91fe-bf91a1c300a3-kube-api-access-7brtk\") pod \"s3-tls-init-custom-qwgx9\" (UID: \"2ecd9ac8-1162-499e-91fe-bf91a1c300a3\") " pod="kserve/s3-tls-init-custom-qwgx9" Apr 24 21:36:03.106056 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:36:03.106012 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-qwgx9" Apr 24 21:36:03.231214 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:36:03.231186 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-qwgx9"] Apr 24 21:36:03.233953 ip-10-0-129-36 kubenswrapper[2577]: W0424 21:36:03.233922 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ecd9ac8_1162_499e_91fe_bf91a1c300a3.slice/crio-ff186f9affe64ddabadd81bc95e5549aeb1f113d3164d84e2f67a2567895480e WatchSource:0}: Error finding container ff186f9affe64ddabadd81bc95e5549aeb1f113d3164d84e2f67a2567895480e: Status 404 returned error can't find the container with id ff186f9affe64ddabadd81bc95e5549aeb1f113d3164d84e2f67a2567895480e Apr 24 21:36:03.610797 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:36:03.610754 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-qwgx9" event={"ID":"2ecd9ac8-1162-499e-91fe-bf91a1c300a3","Type":"ContainerStarted","Data":"4efca4cacdff9829e53404ecdb779c75620128e05fe304294f56c0403baf2d23"} Apr 24 21:36:03.610797 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:36:03.610799 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-qwgx9" event={"ID":"2ecd9ac8-1162-499e-91fe-bf91a1c300a3","Type":"ContainerStarted","Data":"ff186f9affe64ddabadd81bc95e5549aeb1f113d3164d84e2f67a2567895480e"} Apr 24 21:36:03.628494 ip-10-0-129-36 
kubenswrapper[2577]: I0424 21:36:03.628425 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-custom-qwgx9" podStartSLOduration=1.628407003 podStartE2EDuration="1.628407003s" podCreationTimestamp="2026-04-24 21:36:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:36:03.625807233 +0000 UTC m=+517.240077513" watchObservedRunningTime="2026-04-24 21:36:03.628407003 +0000 UTC m=+517.242677281" Apr 24 21:36:08.627648 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:36:08.627614 2577 generic.go:358] "Generic (PLEG): container finished" podID="2ecd9ac8-1162-499e-91fe-bf91a1c300a3" containerID="4efca4cacdff9829e53404ecdb779c75620128e05fe304294f56c0403baf2d23" exitCode=0 Apr 24 21:36:08.628019 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:36:08.627687 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-qwgx9" event={"ID":"2ecd9ac8-1162-499e-91fe-bf91a1c300a3","Type":"ContainerDied","Data":"4efca4cacdff9829e53404ecdb779c75620128e05fe304294f56c0403baf2d23"} Apr 24 21:36:09.756461 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:36:09.756438 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-qwgx9" Apr 24 21:36:09.834821 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:36:09.834785 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7brtk\" (UniqueName: \"kubernetes.io/projected/2ecd9ac8-1162-499e-91fe-bf91a1c300a3-kube-api-access-7brtk\") pod \"2ecd9ac8-1162-499e-91fe-bf91a1c300a3\" (UID: \"2ecd9ac8-1162-499e-91fe-bf91a1c300a3\") " Apr 24 21:36:09.836772 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:36:09.836750 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ecd9ac8-1162-499e-91fe-bf91a1c300a3-kube-api-access-7brtk" (OuterVolumeSpecName: "kube-api-access-7brtk") pod "2ecd9ac8-1162-499e-91fe-bf91a1c300a3" (UID: "2ecd9ac8-1162-499e-91fe-bf91a1c300a3"). InnerVolumeSpecName "kube-api-access-7brtk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:36:09.935614 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:36:09.935535 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7brtk\" (UniqueName: \"kubernetes.io/projected/2ecd9ac8-1162-499e-91fe-bf91a1c300a3-kube-api-access-7brtk\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\"" Apr 24 21:36:10.636192 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:36:10.636161 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-qwgx9" Apr 24 21:36:10.636374 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:36:10.636159 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-qwgx9" event={"ID":"2ecd9ac8-1162-499e-91fe-bf91a1c300a3","Type":"ContainerDied","Data":"ff186f9affe64ddabadd81bc95e5549aeb1f113d3164d84e2f67a2567895480e"} Apr 24 21:36:10.636374 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:36:10.636287 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff186f9affe64ddabadd81bc95e5549aeb1f113d3164d84e2f67a2567895480e" Apr 24 21:37:26.904236 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:37:26.904210 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/1.log" Apr 24 21:37:26.907309 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:37:26.907282 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log" Apr 24 21:37:26.907434 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:37:26.907318 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/1.log" Apr 24 21:37:26.910488 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:37:26.910471 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log" Apr 24 21:42:26.926189 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:42:26.926155 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/1.log" Apr 24 21:42:26.929535 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:42:26.929503 
2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log" Apr 24 21:42:26.930466 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:42:26.930439 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/1.log" Apr 24 21:42:26.933912 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:42:26.933888 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log" Apr 24 21:47:26.948036 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:47:26.948006 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/1.log" Apr 24 21:47:26.952540 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:47:26.952515 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log" Apr 24 21:47:26.958080 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:47:26.958058 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/1.log" Apr 24 21:47:26.961470 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:47:26.961452 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log" Apr 24 21:52:26.974482 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:52:26.974452 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/1.log" Apr 24 
21:52:26.977408 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:52:26.977384 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log" Apr 24 21:52:26.981059 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:52:26.981041 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/1.log" Apr 24 21:52:26.983942 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:52:26.983922 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log" Apr 24 21:57:26.998204 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:57:26.998176 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/1.log" Apr 24 21:57:27.001584 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:57:27.001556 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log" Apr 24 21:57:27.004579 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:57:27.004556 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/1.log" Apr 24 21:57:27.007346 ip-10-0-129-36 kubenswrapper[2577]: I0424 21:57:27.007328 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log" Apr 24 22:02:27.020364 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:02:27.020329 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/1.log" Apr 24 22:02:27.023455 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:02:27.023433 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log" Apr 24 22:02:27.026726 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:02:27.026706 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/1.log" Apr 24 22:02:27.029809 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:02:27.029792 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log" Apr 24 22:07:27.042986 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:07:27.042954 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/1.log" Apr 24 22:07:27.045965 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:07:27.045935 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log" Apr 24 22:07:27.052267 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:07:27.052228 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/1.log" Apr 24 22:07:27.055493 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:07:27.055475 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log" Apr 24 22:12:27.068714 ip-10-0-129-36 kubenswrapper[2577]: I0424 
22:12:27.068683 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/1.log" Apr 24 22:12:27.072230 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:12:27.072197 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log" Apr 24 22:12:27.078040 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:12:27.078018 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/1.log" Apr 24 22:12:27.080866 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:12:27.080846 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log" Apr 24 22:17:27.091775 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:17:27.091744 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/1.log" Apr 24 22:17:27.095224 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:17:27.095198 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log" Apr 24 22:17:27.100521 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:17:27.100499 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/1.log" Apr 24 22:17:27.103393 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:17:27.103374 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log" Apr 
24 22:22:27.114342 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:22:27.114305 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/1.log" Apr 24 22:22:27.117731 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:22:27.117706 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log" Apr 24 22:22:27.122971 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:22:27.122953 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/1.log" Apr 24 22:22:27.125897 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:22:27.125880 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log" Apr 24 22:27:27.136626 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:27:27.136595 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/1.log" Apr 24 22:27:27.139722 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:27:27.139704 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log" Apr 24 22:27:27.144298 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:27:27.144283 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/1.log" Apr 24 22:27:27.147122 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:27:27.147106 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log" Apr 24 22:30:43.837532 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:30:43.837490 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4rgmg/must-gather-88m7q"] Apr 24 22:30:43.837995 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:30:43.837832 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ecd9ac8-1162-499e-91fe-bf91a1c300a3" containerName="s3-tls-init-custom" Apr 24 22:30:43.837995 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:30:43.837844 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ecd9ac8-1162-499e-91fe-bf91a1c300a3" containerName="s3-tls-init-custom" Apr 24 22:30:43.837995 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:30:43.837900 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ecd9ac8-1162-499e-91fe-bf91a1c300a3" containerName="s3-tls-init-custom" Apr 24 22:30:43.841108 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:30:43.841091 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4rgmg/must-gather-88m7q" Apr 24 22:30:43.844330 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:30:43.844309 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-4rgmg\"/\"openshift-service-ca.crt\"" Apr 24 22:30:43.844453 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:30:43.844345 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-4rgmg\"/\"kube-root-ca.crt\"" Apr 24 22:30:43.852606 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:30:43.852587 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4rgmg/must-gather-88m7q"] Apr 24 22:30:43.939144 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:30:43.939110 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/45c1557a-06c5-4185-a5bc-8da8018300fd-must-gather-output\") pod \"must-gather-88m7q\" (UID: \"45c1557a-06c5-4185-a5bc-8da8018300fd\") " pod="openshift-must-gather-4rgmg/must-gather-88m7q" Apr 24 22:30:43.939358 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:30:43.939233 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbrj4\" (UniqueName: \"kubernetes.io/projected/45c1557a-06c5-4185-a5bc-8da8018300fd-kube-api-access-zbrj4\") pod \"must-gather-88m7q\" (UID: \"45c1557a-06c5-4185-a5bc-8da8018300fd\") " pod="openshift-must-gather-4rgmg/must-gather-88m7q" Apr 24 22:30:44.040310 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:30:44.040275 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zbrj4\" (UniqueName: \"kubernetes.io/projected/45c1557a-06c5-4185-a5bc-8da8018300fd-kube-api-access-zbrj4\") pod \"must-gather-88m7q\" (UID: \"45c1557a-06c5-4185-a5bc-8da8018300fd\") " pod="openshift-must-gather-4rgmg/must-gather-88m7q" Apr 24 
22:30:44.040466 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:30:44.040361 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/45c1557a-06c5-4185-a5bc-8da8018300fd-must-gather-output\") pod \"must-gather-88m7q\" (UID: \"45c1557a-06c5-4185-a5bc-8da8018300fd\") " pod="openshift-must-gather-4rgmg/must-gather-88m7q" Apr 24 22:30:44.040662 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:30:44.040647 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/45c1557a-06c5-4185-a5bc-8da8018300fd-must-gather-output\") pod \"must-gather-88m7q\" (UID: \"45c1557a-06c5-4185-a5bc-8da8018300fd\") " pod="openshift-must-gather-4rgmg/must-gather-88m7q" Apr 24 22:30:44.050657 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:30:44.050630 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbrj4\" (UniqueName: \"kubernetes.io/projected/45c1557a-06c5-4185-a5bc-8da8018300fd-kube-api-access-zbrj4\") pod \"must-gather-88m7q\" (UID: \"45c1557a-06c5-4185-a5bc-8da8018300fd\") " pod="openshift-must-gather-4rgmg/must-gather-88m7q" Apr 24 22:30:44.161487 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:30:44.161398 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4rgmg/must-gather-88m7q" Apr 24 22:30:44.282901 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:30:44.282868 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4rgmg/must-gather-88m7q"] Apr 24 22:30:44.285810 ip-10-0-129-36 kubenswrapper[2577]: W0424 22:30:44.285779 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45c1557a_06c5_4185_a5bc_8da8018300fd.slice/crio-65fa7e410bc45549582bfede0e1a33c3939849d0c1f9ede6c5fa01b2b2a5824e WatchSource:0}: Error finding container 65fa7e410bc45549582bfede0e1a33c3939849d0c1f9ede6c5fa01b2b2a5824e: Status 404 returned error can't find the container with id 65fa7e410bc45549582bfede0e1a33c3939849d0c1f9ede6c5fa01b2b2a5824e Apr 24 22:30:44.287439 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:30:44.287422 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:30:45.184810 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:30:45.184772 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4rgmg/must-gather-88m7q" event={"ID":"45c1557a-06c5-4185-a5bc-8da8018300fd","Type":"ContainerStarted","Data":"65fa7e410bc45549582bfede0e1a33c3939849d0c1f9ede6c5fa01b2b2a5824e"} Apr 24 22:30:49.200205 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:30:49.200163 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4rgmg/must-gather-88m7q" event={"ID":"45c1557a-06c5-4185-a5bc-8da8018300fd","Type":"ContainerStarted","Data":"32c400ab654b47de5112a9645353a1d1f475210f709483c023f4a65057e11034"} Apr 24 22:30:49.200205 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:30:49.200206 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4rgmg/must-gather-88m7q" 
event={"ID":"45c1557a-06c5-4185-a5bc-8da8018300fd","Type":"ContainerStarted","Data":"ee804704ffe7687516f746e818e102a1b86251ef4564671544c29efda14c1dee"} Apr 24 22:30:49.217570 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:30:49.217524 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4rgmg/must-gather-88m7q" podStartSLOduration=2.212352979 podStartE2EDuration="6.217509838s" podCreationTimestamp="2026-04-24 22:30:43 +0000 UTC" firstStartedPulling="2026-04-24 22:30:44.287565505 +0000 UTC m=+3797.901835765" lastFinishedPulling="2026-04-24 22:30:48.292722367 +0000 UTC m=+3801.906992624" observedRunningTime="2026-04-24 22:30:49.216711447 +0000 UTC m=+3802.830981750" watchObservedRunningTime="2026-04-24 22:30:49.217509838 +0000 UTC m=+3802.831780117" Apr 24 22:31:10.268362 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:10.268327 2577 generic.go:358] "Generic (PLEG): container finished" podID="45c1557a-06c5-4185-a5bc-8da8018300fd" containerID="ee804704ffe7687516f746e818e102a1b86251ef4564671544c29efda14c1dee" exitCode=0 Apr 24 22:31:10.268773 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:10.268393 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4rgmg/must-gather-88m7q" event={"ID":"45c1557a-06c5-4185-a5bc-8da8018300fd","Type":"ContainerDied","Data":"ee804704ffe7687516f746e818e102a1b86251ef4564671544c29efda14c1dee"} Apr 24 22:31:10.268773 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:10.268680 2577 scope.go:117] "RemoveContainer" containerID="ee804704ffe7687516f746e818e102a1b86251ef4564671544c29efda14c1dee" Apr 24 22:31:10.433497 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:10.433463 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4rgmg_must-gather-88m7q_45c1557a-06c5-4185-a5bc-8da8018300fd/gather/0.log" Apr 24 22:31:14.058422 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:14.058388 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-pvftk_20dab956-3a56-4a27-bcee-9f75822a7970/global-pull-secret-syncer/0.log" Apr 24 22:31:14.187706 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:14.187673 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-g5vd4_67dbaaba-431e-4e09-9019-650f32d8999d/konnectivity-agent/0.log" Apr 24 22:31:14.235090 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:14.235057 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-36.ec2.internal_46c159a6e67f71d6ddbcaa845877ef38/haproxy/0.log" Apr 24 22:31:15.961966 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:15.961934 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4rgmg/must-gather-88m7q"] Apr 24 22:31:15.962445 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:15.962214 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-4rgmg/must-gather-88m7q" podUID="45c1557a-06c5-4185-a5bc-8da8018300fd" containerName="copy" containerID="cri-o://32c400ab654b47de5112a9645353a1d1f475210f709483c023f4a65057e11034" gracePeriod=2 Apr 24 22:31:15.963581 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:15.963555 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4rgmg/must-gather-88m7q"] Apr 24 22:31:15.964025 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:15.964001 2577 status_manager.go:895] "Failed to get status for pod" podUID="45c1557a-06c5-4185-a5bc-8da8018300fd" pod="openshift-must-gather-4rgmg/must-gather-88m7q" err="pods \"must-gather-88m7q\" is forbidden: User \"system:node:ip-10-0-129-36.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-4rgmg\": no relationship found between node 'ip-10-0-129-36.ec2.internal' and this object" Apr 24 22:31:16.189906 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:16.189880 2577 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-must-gather-4rgmg_must-gather-88m7q_45c1557a-06c5-4185-a5bc-8da8018300fd/copy/0.log" Apr 24 22:31:16.190286 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:16.190272 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4rgmg/must-gather-88m7q" Apr 24 22:31:16.192317 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:16.192294 2577 status_manager.go:895] "Failed to get status for pod" podUID="45c1557a-06c5-4185-a5bc-8da8018300fd" pod="openshift-must-gather-4rgmg/must-gather-88m7q" err="pods \"must-gather-88m7q\" is forbidden: User \"system:node:ip-10-0-129-36.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-4rgmg\": no relationship found between node 'ip-10-0-129-36.ec2.internal' and this object" Apr 24 22:31:16.232720 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:16.232673 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbrj4\" (UniqueName: \"kubernetes.io/projected/45c1557a-06c5-4185-a5bc-8da8018300fd-kube-api-access-zbrj4\") pod \"45c1557a-06c5-4185-a5bc-8da8018300fd\" (UID: \"45c1557a-06c5-4185-a5bc-8da8018300fd\") " Apr 24 22:31:16.232790 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:16.232756 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/45c1557a-06c5-4185-a5bc-8da8018300fd-must-gather-output\") pod \"45c1557a-06c5-4185-a5bc-8da8018300fd\" (UID: \"45c1557a-06c5-4185-a5bc-8da8018300fd\") " Apr 24 22:31:16.234310 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:16.234282 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45c1557a-06c5-4185-a5bc-8da8018300fd-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "45c1557a-06c5-4185-a5bc-8da8018300fd" (UID: 
"45c1557a-06c5-4185-a5bc-8da8018300fd"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:31:16.234723 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:16.234695 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45c1557a-06c5-4185-a5bc-8da8018300fd-kube-api-access-zbrj4" (OuterVolumeSpecName: "kube-api-access-zbrj4") pod "45c1557a-06c5-4185-a5bc-8da8018300fd" (UID: "45c1557a-06c5-4185-a5bc-8da8018300fd"). InnerVolumeSpecName "kube-api-access-zbrj4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:31:16.286398 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:16.286366 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4rgmg_must-gather-88m7q_45c1557a-06c5-4185-a5bc-8da8018300fd/copy/0.log" Apr 24 22:31:16.286713 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:16.286693 2577 generic.go:358] "Generic (PLEG): container finished" podID="45c1557a-06c5-4185-a5bc-8da8018300fd" containerID="32c400ab654b47de5112a9645353a1d1f475210f709483c023f4a65057e11034" exitCode=143 Apr 24 22:31:16.286776 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:16.286746 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4rgmg/must-gather-88m7q" Apr 24 22:31:16.286816 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:16.286776 2577 scope.go:117] "RemoveContainer" containerID="32c400ab654b47de5112a9645353a1d1f475210f709483c023f4a65057e11034" Apr 24 22:31:16.288879 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:16.288853 2577 status_manager.go:895] "Failed to get status for pod" podUID="45c1557a-06c5-4185-a5bc-8da8018300fd" pod="openshift-must-gather-4rgmg/must-gather-88m7q" err="pods \"must-gather-88m7q\" is forbidden: User \"system:node:ip-10-0-129-36.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-4rgmg\": no relationship found between node 'ip-10-0-129-36.ec2.internal' and this object" Apr 24 22:31:16.294883 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:16.294864 2577 scope.go:117] "RemoveContainer" containerID="ee804704ffe7687516f746e818e102a1b86251ef4564671544c29efda14c1dee" Apr 24 22:31:16.296818 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:16.296794 2577 status_manager.go:895] "Failed to get status for pod" podUID="45c1557a-06c5-4185-a5bc-8da8018300fd" pod="openshift-must-gather-4rgmg/must-gather-88m7q" err="pods \"must-gather-88m7q\" is forbidden: User \"system:node:ip-10-0-129-36.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-4rgmg\": no relationship found between node 'ip-10-0-129-36.ec2.internal' and this object" Apr 24 22:31:16.305905 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:16.305890 2577 scope.go:117] "RemoveContainer" containerID="32c400ab654b47de5112a9645353a1d1f475210f709483c023f4a65057e11034" Apr 24 22:31:16.306117 ip-10-0-129-36 kubenswrapper[2577]: E0424 22:31:16.306100 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32c400ab654b47de5112a9645353a1d1f475210f709483c023f4a65057e11034\": container with ID starting 
with 32c400ab654b47de5112a9645353a1d1f475210f709483c023f4a65057e11034 not found: ID does not exist" containerID="32c400ab654b47de5112a9645353a1d1f475210f709483c023f4a65057e11034" Apr 24 22:31:16.306161 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:16.306123 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32c400ab654b47de5112a9645353a1d1f475210f709483c023f4a65057e11034"} err="failed to get container status \"32c400ab654b47de5112a9645353a1d1f475210f709483c023f4a65057e11034\": rpc error: code = NotFound desc = could not find container \"32c400ab654b47de5112a9645353a1d1f475210f709483c023f4a65057e11034\": container with ID starting with 32c400ab654b47de5112a9645353a1d1f475210f709483c023f4a65057e11034 not found: ID does not exist" Apr 24 22:31:16.306161 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:16.306140 2577 scope.go:117] "RemoveContainer" containerID="ee804704ffe7687516f746e818e102a1b86251ef4564671544c29efda14c1dee" Apr 24 22:31:16.306367 ip-10-0-129-36 kubenswrapper[2577]: E0424 22:31:16.306350 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee804704ffe7687516f746e818e102a1b86251ef4564671544c29efda14c1dee\": container with ID starting with ee804704ffe7687516f746e818e102a1b86251ef4564671544c29efda14c1dee not found: ID does not exist" containerID="ee804704ffe7687516f746e818e102a1b86251ef4564671544c29efda14c1dee" Apr 24 22:31:16.306419 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:16.306370 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee804704ffe7687516f746e818e102a1b86251ef4564671544c29efda14c1dee"} err="failed to get container status \"ee804704ffe7687516f746e818e102a1b86251ef4564671544c29efda14c1dee\": rpc error: code = NotFound desc = could not find container \"ee804704ffe7687516f746e818e102a1b86251ef4564671544c29efda14c1dee\": container with ID starting with 
ee804704ffe7687516f746e818e102a1b86251ef4564671544c29efda14c1dee not found: ID does not exist" Apr 24 22:31:16.333688 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:16.333670 2577 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/45c1557a-06c5-4185-a5bc-8da8018300fd-must-gather-output\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\"" Apr 24 22:31:16.333762 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:16.333690 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zbrj4\" (UniqueName: \"kubernetes.io/projected/45c1557a-06c5-4185-a5bc-8da8018300fd-kube-api-access-zbrj4\") on node \"ip-10-0-129-36.ec2.internal\" DevicePath \"\"" Apr 24 22:31:16.970945 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:16.970901 2577 status_manager.go:895] "Failed to get status for pod" podUID="45c1557a-06c5-4185-a5bc-8da8018300fd" pod="openshift-must-gather-4rgmg/must-gather-88m7q" err="pods \"must-gather-88m7q\" is forbidden: User \"system:node:ip-10-0-129-36.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-4rgmg\": no relationship found between node 'ip-10-0-129-36.ec2.internal' and this object" Apr 24 22:31:16.971319 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:16.971157 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45c1557a-06c5-4185-a5bc-8da8018300fd" path="/var/lib/kubelet/pods/45c1557a-06c5-4185-a5bc-8da8018300fd/volumes" Apr 24 22:31:17.951224 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:17.951198 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-hkjn8_d3bee81d-b2d9-4efd-8dd1-045747be92da/cluster-monitoring-operator/0.log" Apr 24 22:31:18.047267 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:18.047216 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_metrics-server-788b6459f7-2nk2j_7f441a5f-864e-4788-b82c-734df5988bbd/metrics-server/0.log" Apr 24 22:31:18.264436 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:18.264346 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-w6q5b_137ba05c-a1f1-46e3-b5fc-28b957ed5fc9/node-exporter/0.log" Apr 24 22:31:18.286903 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:18.286878 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-w6q5b_137ba05c-a1f1-46e3-b5fc-28b957ed5fc9/kube-rbac-proxy/0.log" Apr 24 22:31:18.309278 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:18.309237 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-w6q5b_137ba05c-a1f1-46e3-b5fc-28b957ed5fc9/init-textfile/0.log" Apr 24 22:31:18.444771 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:18.444730 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd/prometheus/0.log" Apr 24 22:31:18.461369 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:18.461345 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd/config-reloader/0.log" Apr 24 22:31:18.485500 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:18.485475 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd/thanos-sidecar/0.log" Apr 24 22:31:18.507143 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:18.507124 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd/kube-rbac-proxy-web/0.log" Apr 24 22:31:18.530032 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:18.529966 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd/kube-rbac-proxy/0.log" Apr 24 22:31:18.551930 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:18.551905 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd/kube-rbac-proxy-thanos/0.log" Apr 24 22:31:18.581768 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:18.581738 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b6bb4f5f-8724-4bdd-8ba1-d0b27efb5cbd/init-config-reloader/0.log" Apr 24 22:31:18.615298 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:18.615273 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-5hp6m_2260b4d6-e6d6-4354-8ff5-52f186f6fdba/prometheus-operator/0.log" Apr 24 22:31:18.645721 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:18.645699 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-5hp6m_2260b4d6-e6d6-4354-8ff5-52f186f6fdba/kube-rbac-proxy/0.log" Apr 24 22:31:18.685868 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:18.685841 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-wkgcw_c93a2e52-5f47-430a-80f6-0b8d1ee5ab9a/prometheus-operator-admission-webhook/0.log" Apr 24 22:31:18.730498 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:18.730469 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-b778d7999-4kq7h_a39ae09b-1beb-495e-8642-7487f193d5db/telemeter-client/0.log" Apr 24 22:31:18.757854 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:18.757829 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-b778d7999-4kq7h_a39ae09b-1beb-495e-8642-7487f193d5db/reload/0.log" Apr 24 22:31:18.788334 ip-10-0-129-36 
kubenswrapper[2577]: I0424 22:31:18.788275 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-b778d7999-4kq7h_a39ae09b-1beb-495e-8642-7487f193d5db/kube-rbac-proxy/0.log" Apr 24 22:31:18.828603 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:18.828565 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-754db4fb4f-dmkdh_bf620266-e2bc-4c7f-9f17-6e6bd7623500/thanos-query/0.log" Apr 24 22:31:18.852305 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:18.852282 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-754db4fb4f-dmkdh_bf620266-e2bc-4c7f-9f17-6e6bd7623500/kube-rbac-proxy-web/0.log" Apr 24 22:31:18.887539 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:18.887506 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-754db4fb4f-dmkdh_bf620266-e2bc-4c7f-9f17-6e6bd7623500/kube-rbac-proxy/0.log" Apr 24 22:31:18.907721 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:18.907699 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-754db4fb4f-dmkdh_bf620266-e2bc-4c7f-9f17-6e6bd7623500/prom-label-proxy/0.log" Apr 24 22:31:18.933353 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:18.933326 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-754db4fb4f-dmkdh_bf620266-e2bc-4c7f-9f17-6e6bd7623500/kube-rbac-proxy-rules/0.log" Apr 24 22:31:18.955258 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:18.955215 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-754db4fb4f-dmkdh_bf620266-e2bc-4c7f-9f17-6e6bd7623500/kube-rbac-proxy-metrics/0.log" Apr 24 22:31:20.406788 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:20.406758 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/1.log" Apr 24 22:31:20.414477 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:20.414449 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-96wlk_b14cefbb-8e93-43c4-8a2d-f70afbe6cab4/console-operator/2.log" Apr 24 22:31:20.810704 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:20.810630 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-kvj2r_5701db21-73a5-4846-bd66-5cb8f4331749/download-server/0.log" Apr 24 22:31:21.168264 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:21.168217 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-scnl7/perf-node-gather-daemonset-xxwgc"] Apr 24 22:31:21.168675 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:21.168661 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45c1557a-06c5-4185-a5bc-8da8018300fd" containerName="copy" Apr 24 22:31:21.168722 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:21.168679 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="45c1557a-06c5-4185-a5bc-8da8018300fd" containerName="copy" Apr 24 22:31:21.168722 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:21.168701 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45c1557a-06c5-4185-a5bc-8da8018300fd" containerName="gather" Apr 24 22:31:21.168722 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:21.168707 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="45c1557a-06c5-4185-a5bc-8da8018300fd" containerName="gather" Apr 24 22:31:21.168809 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:21.168756 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="45c1557a-06c5-4185-a5bc-8da8018300fd" containerName="gather" Apr 24 22:31:21.168809 ip-10-0-129-36 kubenswrapper[2577]: I0424 
22:31:21.168768 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="45c1557a-06c5-4185-a5bc-8da8018300fd" containerName="copy" Apr 24 22:31:21.172336 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:21.172319 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-xxwgc" Apr 24 22:31:21.174526 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:21.174506 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-scnl7\"/\"default-dockercfg-vdl5x\"" Apr 24 22:31:21.174636 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:21.174513 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-scnl7\"/\"openshift-service-ca.crt\"" Apr 24 22:31:21.175420 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:21.175402 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-scnl7\"/\"kube-root-ca.crt\"" Apr 24 22:31:21.180739 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:21.180719 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-scnl7/perf-node-gather-daemonset-xxwgc"] Apr 24 22:31:21.269886 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:21.269857 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/aeb7f768-fae9-436d-9380-a269d1abc712-podres\") pod \"perf-node-gather-daemonset-xxwgc\" (UID: \"aeb7f768-fae9-436d-9380-a269d1abc712\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-xxwgc" Apr 24 22:31:21.270056 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:21.269891 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aeb7f768-fae9-436d-9380-a269d1abc712-lib-modules\") pod \"perf-node-gather-daemonset-xxwgc\" 
(UID: \"aeb7f768-fae9-436d-9380-a269d1abc712\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-xxwgc" Apr 24 22:31:21.270056 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:21.269915 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/aeb7f768-fae9-436d-9380-a269d1abc712-proc\") pod \"perf-node-gather-daemonset-xxwgc\" (UID: \"aeb7f768-fae9-436d-9380-a269d1abc712\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-xxwgc" Apr 24 22:31:21.270056 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:21.269930 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxpnb\" (UniqueName: \"kubernetes.io/projected/aeb7f768-fae9-436d-9380-a269d1abc712-kube-api-access-sxpnb\") pod \"perf-node-gather-daemonset-xxwgc\" (UID: \"aeb7f768-fae9-436d-9380-a269d1abc712\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-xxwgc" Apr 24 22:31:21.270056 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:21.269956 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aeb7f768-fae9-436d-9380-a269d1abc712-sys\") pod \"perf-node-gather-daemonset-xxwgc\" (UID: \"aeb7f768-fae9-436d-9380-a269d1abc712\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-xxwgc" Apr 24 22:31:21.370998 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:21.370964 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/aeb7f768-fae9-436d-9380-a269d1abc712-podres\") pod \"perf-node-gather-daemonset-xxwgc\" (UID: \"aeb7f768-fae9-436d-9380-a269d1abc712\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-xxwgc" Apr 24 22:31:21.370998 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:21.371000 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aeb7f768-fae9-436d-9380-a269d1abc712-lib-modules\") pod \"perf-node-gather-daemonset-xxwgc\" (UID: \"aeb7f768-fae9-436d-9380-a269d1abc712\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-xxwgc" Apr 24 22:31:21.371199 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:21.371023 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/aeb7f768-fae9-436d-9380-a269d1abc712-proc\") pod \"perf-node-gather-daemonset-xxwgc\" (UID: \"aeb7f768-fae9-436d-9380-a269d1abc712\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-xxwgc" Apr 24 22:31:21.371199 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:21.371040 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sxpnb\" (UniqueName: \"kubernetes.io/projected/aeb7f768-fae9-436d-9380-a269d1abc712-kube-api-access-sxpnb\") pod \"perf-node-gather-daemonset-xxwgc\" (UID: \"aeb7f768-fae9-436d-9380-a269d1abc712\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-xxwgc" Apr 24 22:31:21.371199 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:21.371059 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aeb7f768-fae9-436d-9380-a269d1abc712-sys\") pod \"perf-node-gather-daemonset-xxwgc\" (UID: \"aeb7f768-fae9-436d-9380-a269d1abc712\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-xxwgc" Apr 24 22:31:21.371199 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:21.371093 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/aeb7f768-fae9-436d-9380-a269d1abc712-podres\") pod \"perf-node-gather-daemonset-xxwgc\" (UID: \"aeb7f768-fae9-436d-9380-a269d1abc712\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-xxwgc" Apr 
Apr 24 22:31:21.371199 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:21.371107 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aeb7f768-fae9-436d-9380-a269d1abc712-lib-modules\") pod \"perf-node-gather-daemonset-xxwgc\" (UID: \"aeb7f768-fae9-436d-9380-a269d1abc712\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-xxwgc"
Apr 24 22:31:21.371199 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:21.371136 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/aeb7f768-fae9-436d-9380-a269d1abc712-proc\") pod \"perf-node-gather-daemonset-xxwgc\" (UID: \"aeb7f768-fae9-436d-9380-a269d1abc712\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-xxwgc"
Apr 24 22:31:21.371199 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:21.371150 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aeb7f768-fae9-436d-9380-a269d1abc712-sys\") pod \"perf-node-gather-daemonset-xxwgc\" (UID: \"aeb7f768-fae9-436d-9380-a269d1abc712\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-xxwgc"
Apr 24 22:31:21.379431 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:21.379412 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxpnb\" (UniqueName: \"kubernetes.io/projected/aeb7f768-fae9-436d-9380-a269d1abc712-kube-api-access-sxpnb\") pod \"perf-node-gather-daemonset-xxwgc\" (UID: \"aeb7f768-fae9-436d-9380-a269d1abc712\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-xxwgc"
Apr 24 22:31:21.482537 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:21.482440 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-xxwgc"
Apr 24 22:31:21.601479 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:21.601449 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-scnl7/perf-node-gather-daemonset-xxwgc"]
Apr 24 22:31:21.604045 ip-10-0-129-36 kubenswrapper[2577]: W0424 22:31:21.604016 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podaeb7f768_fae9_436d_9380_a269d1abc712.slice/crio-e3e487f98076bcc3a7861545e2fa6e2955b7523c2a571b5c7d40ddddcec9d542 WatchSource:0}: Error finding container e3e487f98076bcc3a7861545e2fa6e2955b7523c2a571b5c7d40ddddcec9d542: Status 404 returned error can't find the container with id e3e487f98076bcc3a7861545e2fa6e2955b7523c2a571b5c7d40ddddcec9d542
Apr 24 22:31:22.028208 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:22.028180 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ttnhg_6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37/dns/0.log"
Apr 24 22:31:22.047622 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:22.047594 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ttnhg_6f3bd325-dc2b-4af8-a5a7-0afd9a1fbd37/kube-rbac-proxy/0.log"
Apr 24 22:31:22.069295 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:22.069266 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5fmr6_154e8a35-de7d-4d32-a077-f455b275faf2/dns-node-resolver/0.log"
Apr 24 22:31:22.304667 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:22.304572 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-xxwgc" event={"ID":"aeb7f768-fae9-436d-9380-a269d1abc712","Type":"ContainerStarted","Data":"6c067eba8c24fea9e1d1fe4e150d4030ff6f1d7b3ff3e946e2c529f1772e7538"}
Apr 24 22:31:22.304667 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:22.304611 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-xxwgc" event={"ID":"aeb7f768-fae9-436d-9380-a269d1abc712","Type":"ContainerStarted","Data":"e3e487f98076bcc3a7861545e2fa6e2955b7523c2a571b5c7d40ddddcec9d542"}
Apr 24 22:31:22.304667 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:22.304643 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-xxwgc"
Apr 24 22:31:22.322103 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:22.322060 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-xxwgc" podStartSLOduration=1.322048777 podStartE2EDuration="1.322048777s" podCreationTimestamp="2026-04-24 22:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:31:22.320135099 +0000 UTC m=+3835.934405388" watchObservedRunningTime="2026-04-24 22:31:22.322048777 +0000 UTC m=+3835.936319056"
Apr 24 22:31:22.610829 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:22.610763 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qx6tc_06abc9ab-6358-4dae-add4-0d288195411f/node-ca/0.log"
Apr 24 22:31:23.306345 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:23.306316 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-b4467575d-c7qhh_969ea49e-4e0c-48d5-9e89-dfddef64c993/router/0.log"
Apr 24 22:31:23.676726 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:23.676670 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-zhh6t_def556d7-437a-4b70-b31e-6643ed89bc7e/serve-healthcheck-canary/0.log"
Apr 24 22:31:24.190260 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:24.190223 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pd6d8_d1f1f29b-6485-4944-a0a9-b2afb33787d9/kube-rbac-proxy/0.log"
Apr 24 22:31:24.210542 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:24.210518 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pd6d8_d1f1f29b-6485-4944-a0a9-b2afb33787d9/exporter/0.log"
Apr 24 22:31:24.232698 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:24.232669 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pd6d8_d1f1f29b-6485-4944-a0a9-b2afb33787d9/extractor/0.log"
Apr 24 22:31:26.566056 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:26.566029 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-rqzzm_34d01e73-a26b-409a-9160-fb65661c3301/s3-init/0.log"
Apr 24 22:31:26.589872 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:26.589844 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-qwgx9_2ecd9ac8-1162-499e-91fe-bf91a1c300a3/s3-tls-init-custom/0.log"
Apr 24 22:31:28.317264 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:28.317213 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-xxwgc"
Apr 24 22:31:30.675912 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:30.675884 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-db2jz_18ef204f-6aa6-4107-8f8f-26e4ab42c428/migrator/0.log"
Apr 24 22:31:30.697757 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:30.697732 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-db2jz_18ef204f-6aa6-4107-8f8f-26e4ab42c428/graceful-termination/0.log"
Apr 24 22:31:31.947079 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:31.947045 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9vwgn_28a670ce-fdb2-4872-af68-5a9ab19b64cc/kube-multus/0.log"
Apr 24 22:31:32.132947 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:32.132917 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gcwlm_8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c/kube-multus-additional-cni-plugins/0.log"
Apr 24 22:31:32.155964 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:32.155933 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gcwlm_8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c/egress-router-binary-copy/0.log"
Apr 24 22:31:32.179471 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:32.179403 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gcwlm_8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c/cni-plugins/0.log"
Apr 24 22:31:32.207310 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:32.207289 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gcwlm_8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c/bond-cni-plugin/0.log"
Apr 24 22:31:32.230902 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:32.230881 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gcwlm_8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c/routeoverride-cni/0.log"
Apr 24 22:31:32.251445 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:32.251423 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gcwlm_8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c/whereabouts-cni-bincopy/0.log"
Apr 24 22:31:32.271706 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:32.271689 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gcwlm_8ba8ae41-c3c4-46dd-aecb-a8d704c38c1c/whereabouts-cni/0.log"
Apr 24 22:31:32.637088 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:32.637058 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jcztz_b0b45556-212a-460b-a5ae-108beeb6197d/network-metrics-daemon/0.log"
Apr 24 22:31:32.656222 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:32.656169 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jcztz_b0b45556-212a-460b-a5ae-108beeb6197d/kube-rbac-proxy/0.log"
Apr 24 22:31:33.724331 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:33.724304 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-controller/0.log"
Apr 24 22:31:33.741568 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:33.741542 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/0.log"
Apr 24 22:31:33.775141 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:33.775114 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovn-acl-logging/1.log"
Apr 24 22:31:33.797607 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:33.797577 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/kube-rbac-proxy-node/0.log"
Apr 24 22:31:33.820360 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:33.820331 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 22:31:33.836377 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:33.836352 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/northd/0.log"
Apr 24 22:31:33.857369 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:33.857347 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/nbdb/0.log"
Apr 24 22:31:33.878052 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:33.878028 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/sbdb/0.log"
Apr 24 22:31:34.058892 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:34.058823 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-csxqq_2054eaf6-5d96-49ee-86ed-e32bdb5b9ea0/ovnkube-controller/0.log"
Apr 24 22:31:35.472958 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:35.472921 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-cnjcx_25d36037-41e8-4ba4-9072-939c3c9e4e19/network-check-target-container/0.log"
Apr 24 22:31:36.378515 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:36.378391 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-xtssb_02092214-a2c7-40c0-8e80-688f20002a35/iptables-alerter/0.log"
Apr 24 22:31:37.004794 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:37.004764 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-4bxgg_82fe3a05-41ab-423c-aab1-343f07ea6c35/tuned/0.log"
Apr 24 22:31:38.686413 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:38.686362 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-gjg96_00ab5f90-95f6-4c68-b97c-a55985c40e09/cluster-samples-operator/0.log"
Apr 24 22:31:38.706830 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:38.706792 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-gjg96_00ab5f90-95f6-4c68-b97c-a55985c40e09/cluster-samples-operator-watch/0.log"
Apr 24 22:31:40.288468 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:40.288433 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-t8mq7_54bfba7f-bc92-446e-9646-877d96783afd/csi-driver/0.log"
Apr 24 22:31:40.309986 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:40.309961 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-t8mq7_54bfba7f-bc92-446e-9646-877d96783afd/csi-node-driver-registrar/0.log"
Apr 24 22:31:40.331163 ip-10-0-129-36 kubenswrapper[2577]: I0424 22:31:40.331138 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-t8mq7_54bfba7f-bc92-446e-9646-877d96783afd/csi-liveness-probe/0.log"